A Fast Knowledge Distillation Framework for Visual Recognition (Dec 2, 2021) [code available]: Image Classification
Distilling Holistic Knowledge with Graph Neural Networks (Aug 12, 2021) [code available]: Knowledge Distillation
Contrastive Distillation on Intermediate Representations for Language Model Compression (Sep 29, 2020) [code available]: Knowledge Distillation, Language Modeling
Distilling Image Classifiers in Object Detectors (Jun 9, 2021) [code available]: Knowledge Distillation, Object
Geometer: Graph Few-Shot Class-Incremental Learning via Prototype Representation (May 27, 2022) [code available]: Class-Incremental Learning
Contrastive Representation Distillation (Oct 23, 2019) [code available]: Contrastive Learning, Knowledge Distillation
Prototype-based Incremental Few-Shot Semantic Segmentation (Nov 30, 2020) [code available]: Few-Shot Semantic Segmentation, Incremental Learning
Distilling Knowledge from Ensembles of Acoustic Models for Joint CTC-Attention End-to-End Speech Recognition (May 19, 2020) [code available]: Automatic Speech Recognition (ASR)
Contrastive Deep Supervision (Jul 12, 2022) [code available]: Contrastive Learning, Fine-Grained Image Classification
COMEDIAN: Self-Supervised Learning and Knowledge Distillation for Action Spotting using Transformers (Sep 3, 2023) [code available]: Action Detection, Action Spotting
Distilling Knowledge from Self-Supervised Teacher by Embedding Graph Alignment (Nov 23, 2022) [code available]: Knowledge Distillation, Representation Learning
Communication-Efficient Federated Learning through Adaptive Weight Clustering and Server-Side Distillation (Jan 25, 2024) [code available]: Clustering, Federated Learning
Distilling Linguistic Context for Language Model Compression (Sep 17, 2021) [code available]: Knowledge Distillation, Language Modeling
Distilling Large Vision-Language Model with Out-of-Distribution Generalizability (Jul 6, 2023) [code available]: Few-Shot Image Classification, Image Classification
GenFormer -- Generated Images are All You Need to Improve Robustness of Transformers on Small Datasets (Aug 26, 2024) [code available]: Data Augmentation
Distilling Meta Knowledge on Heterogeneous Graph for Illicit Drug Trafficker Detection on Social Media (Dec 1, 2021) [code available]: Knowledge Distillation, Marketing
Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks (Oct 24, 2022) [code available]: Knowledge Distillation, Transfer Learning
Distilling Object Detectors via Decoupled Features (Mar 26, 2021) [code available]: Image Classification
Distilling Object Detectors with Feature Richness (Nov 1, 2021) [code available]: Knowledge Distillation, Model Compression
Knowledge Distillation via Route Constrained Optimization (Apr 19, 2019) [code available]: Face Recognition, Knowledge Distillation
Grad-CAM++: Improved Visual Explanations for Deep Convolutional Networks (Oct 30, 2017) [code available]: 3D Action Recognition, Action Recognition
Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation (May 19, 2021) [code available]: Image Classification, Knowledge Distillation
Continual Learning for LiDAR Semantic Segmentation: Class-Incremental and Coarse-to-Fine strategies on Sparse Data (Apr 8, 2023) [code available]: Class-Incremental Learning
Distilling the Knowledge in a Neural Network (Mar 9, 2015) [code available]: Knowledge Distillation, Mixture-of-Experts
Complementary Relation Contrastive Distillation (Mar 29, 2021) [code available]: Knowledge Distillation, Relation
LabelDistill: Label-guided Cross-modal Knowledge Distillation for Camera-based 3D Object Detection (Jul 14, 2024) [code available]: 3D Object Detection, Depth Estimation
Can LLM Watermarks Robustly Prevent Unauthorized Knowledge Distillation? (Feb 17, 2025) [code available]: Knowledge Distillation, Language Modeling
Label Poisoning is All You Need (Oct 29, 2023) [code available]: Backdoor Attack
Generative Bias for Robust Visual Question Answering (Aug 1, 2022) [code available]: Knowledge Distillation, Question Answering
A Discrepancy Aware Framework for Robust Anomaly Detection (Oct 11, 2023) [code available]: Anomaly Detection, Decoder
Continual Learning for Image Segmentation with Dynamic Query (Nov 29, 2023) [code available]: Continual Learning, Diversity
Comprehensive Knowledge Distillation with Causal Intervention (Dec 1, 2021) [code available]: Causal Inference, Knowledge Distillation
Distill on the Go: Online knowledge distillation in self-supervised learning (Apr 20, 2021) [code available]: Knowledge Distillation, Self-Supervised Learning
Distill the Image to Nowhere: Inversion Knowledge Distillation for Multimodal Machine Translation (Oct 10, 2022) [code available]: Knowledge Distillation, Machine Translation
DistilProtBert: A distilled protein language model used to distinguish between real proteins and their randomly shuffled counterparts (May 10, 2022) [code available]: Dimensionality Reduction, Knowledge Distillation
DistilPose: Tokenized Pose Regression with Heatmap Distillation (Mar 4, 2023) [code available]: Knowledge Distillation, Pose Estimation
Generative Model-based Feature Knowledge Distillation for Action Recognition (Dec 14, 2023) [code available]: Action Detection, Action Recognition
AgeFlow: Conditional Age Progression and Regression with Normalizing Flows (May 15, 2021) [code available]: Attribute, Knowledge Distillation
Divide to Adapt: Mitigating Confirmation Bias for Domain Adaptation of Black-Box Predictors (May 28, 2022) [code available]: Domain Adaptation, Knowledge Distillation
Distributed Dynamic Map Fusion via Federated Learning for Intelligent Networked Vehicles (Mar 5, 2021) [code available]: Federated Learning, Knowledge Distillation
ABKD: Pursuing a Proper Allocation of the Probability Mass in Knowledge Distillation via α-β-Divergence (May 7, 2025) [code available]: Knowledge Distillation
DisWOT: Student Architecture Search for Distillation WithOut Training (Mar 28, 2023) [code available]: Knowledge Distillation
Distribution-aware Knowledge Prototyping for Non-exemplar Lifelong Person Re-identification (Jan 1, 2024) [code available]: Diversity, Knowledge Distillation
Distribution-aware Forgetting Compensation for Exemplar-Free Lifelong Person Re-identification (Apr 21, 2025) [code available]: Exemplar-Free, Knowledge Distillation
Continual evaluation for lifelong learning: Identifying the stability gap (May 26, 2022) [code available]: Continual Learning, Incremental Learning
DKDL-Net: A Lightweight Bearing Fault Detection Model via Decoupled Knowledge Distillation and Low-Rank Adaptation Fine-tuning (Jun 10, 2024) [code available]: Fault Detection, Fault Diagnosis
DM-VTON: Distilled Mobile Real-time Virtual Try-On (Aug 26, 2023) [code available]: GPU, Human Parsing
Computation-Efficient Knowledge Distillation via Uncertainty-Aware Mixup (Dec 17, 2020) [code available]: Informativeness, Knowledge Distillation
DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval (Jun 24, 2021) [code available]: Computational Efficiency, Knowledge Distillation
Creating Something from Nothing: Unsupervised Knowledge Distillation for Cross-Modal Hashing (Apr 1, 2020) [code available]: Knowledge Distillation, Retrieval