- Boosting Multi-Label Image Classification with Complementary Parallel Self-Distillation (May 23, 2022). Tags: Image Classification
- PointDistiller: Structured Knowledge Distillation Towards Efficient and Compact 3D Detection (May 23, 2022). Tags: 3D Object Detection, Knowledge Distillation
- IDEAL: Query-Efficient Data-Free Learning from Black-box Models (May 23, 2022). Tags: Knowledge Distillation
- Knowledge Distillation via the Target-aware Transformer (May 22, 2022). Tags: Knowledge Distillation
- Knowledge Distillation from A Stronger Teacher (May 21, 2022). Tags: Image Classification
- Exploring Extreme Parameter Compression for Pre-trained Language Models (May 20, 2022). Tags: Knowledge Distillation, Tensor Decomposition
- Directed Acyclic Transformer for Non-Autoregressive Machine Translation (May 16, 2022). Tags: Knowledge Distillation, Machine Translation
- Knowledge Distillation Meets Open-Set Semi-Supervised Learning (May 13, 2022). Tags: Face Recognition, Knowledge Distillation
- DistilProtBert: A distilled protein language model used to distinguish between real proteins and their randomly shuffled counterparts (May 10, 2022). Tags: Dimensionality Reduction, Knowledge Distillation
- Spot-adaptive Knowledge Distillation (May 5, 2022). Tags: Knowledge Distillation
- Nearest Neighbor Knowledge Distillation for Neural Machine Translation (May 1, 2022). Tags: Knowledge Distillation, Machine Translation
- Curriculum Learning for Dense Retrieval Distillation (Apr 28, 2022). Tags: Knowledge Distillation, Passage Retrieval
- Conformer and Blind Noisy Students for Improved Image Quality Assessment (Apr 27, 2022). Tags: Image Quality Assessment, Image Restoration
- Proto2Proto: Can you recognize the car, the way I do? (Apr 25, 2022). Tags: Knowledge Distillation
- On-Device Next-Item Recommendation with Self-Supervised Knowledge Distillation (Apr 23, 2022). Tags: Knowledge Distillation, Recommendation Systems
- Eliminating Backdoor Triggers for Deep Neural Networks Using Attention Relation Graph Distillation (Apr 21, 2022). Tags: Backdoor Defense, Knowledge Distillation
- Modeling Missing Annotations for Incremental Learning in Object Detection (Apr 19, 2022). Tags: Incremental Learning, Instance Segmentation
- DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation (Apr 19, 2022). Tags: Dialogue Generation, Knowledge Distillation
- MoEBERT: from BERT to Mixture-of-Experts via Importance-Guided Adaptation (Apr 15, 2022). Tags: Knowledge Distillation, Mixture-of-Experts
- LRH-Net: A Multi-Level Knowledge Distillation Approach for Low-Resource Heart Network (Apr 11, 2022). Tags: Knowledge Distillation
- Overcoming Catastrophic Forgetting in Incremental Object Detection via Elastic Response Distillation (Apr 5, 2022). Tags: Class-Incremental Object Detection, Incremental Learning
- Class-Incremental Learning by Knowledge Distillation with Adaptive Feature Consolidation (Apr 2, 2022). Tags: Class-Incremental Learning
- End-to-End Zero-Shot HOI Detection via Vision and Language Knowledge Distillation (Apr 1, 2022). Tags: Human-Object Interaction Detection, Knowledge Distillation
- Distill-VQ: Learning Retrieval Oriented Vector Quantization By Distilling Knowledge from Dense Embeddings (Apr 1, 2022). Tags: Contrastive Learning, Knowledge Distillation
- Feature Structure Distillation with Centered Kernel Alignment in BERT Transferring (Apr 1, 2022). Tags: Knowledge Distillation, Language Modeling
- It's All In the Teacher: Zero-Shot Quantization Brought Closer to the Teacher (Mar 31, 2022). Tags: Data-Free Quantization
- Self-Distillation from the Last Mini-Batch for Consistency Regularization (Mar 30, 2022). Tags: Knowledge Distillation
- Rainbow Keywords: Efficient Incremental Learning for Online Spoken Keyword Spotting (Mar 30, 2022). Tags: Data Augmentation, Diversity
- Monitored Distillation for Positive Congruent Depth Completion (Mar 30, 2022). Tags: Depth Completion, Image Reconstruction
- Instance Relation Graph Guided Source-Free Domain Adaptive Object Detection (Mar 29, 2022). Tags: Domain Adaptation, Knowledge Distillation
- Uncertainty-aware Contrastive Distillation for Incremental Semantic Segmentation (Mar 26, 2022). Tags: Contrastive Learning, Image Classification
- Knowledge Distillation with the Reused Teacher Classifier (Mar 26, 2022). Tags: Knowledge Distillation
- PCA-Based Knowledge Distillation Towards Lightweight and Content-Style Balanced Photorealistic Style Transfer Models (Mar 25, 2022). Tags: Knowledge Distillation, Style Transfer
- Model LEGO: Creating Models Like Disassembling and Assembling Building Blocks (Mar 25, 2022). Tags: Incremental Learning, Knowledge Distillation
- Rich Feature Construction for the Optimization-Generalization Dilemma (Mar 24, 2022). Tags: Inductive Bias, Knowledge Distillation
- Ensembling and Knowledge Distilling of Large Sequence Taggers for Grammatical Error Correction (Mar 24, 2022). Tags: Grammatical Error Correction, Knowledge Distillation
- R-DFCIL: Relation-Guided Representation Learning for Data-Free Class Incremental Learning (Mar 24, 2022). Tags: Class-Incremental Learning
- SSD-KD: A Self-supervised Diverse Knowledge Distillation Method for Lightweight Skin Lesion Classification Using Dermoscopic Images (Mar 22, 2022). Tags: Knowledge Distillation, Lesion Classification
- DQ-BART: Efficient Sequence-to-Sequence Model via Joint Distillation and Quantization (Mar 21, 2022). Tags: Knowledge Distillation, Model Compression
- Document-Level Relation Extraction with Adaptive Focal Loss and Knowledge Distillation (Mar 21, 2022). Tags: Document-Level Relation Extraction, Knowledge Distillation
- Open-Vocabulary One-Stage Detection with Hierarchical Visual-Language Knowledge Distillation (Mar 20, 2022). Tags: Knowledge Distillation, Language Modelling
- Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning (Mar 17, 2022). Tags: Data-Free Knowledge Distillation, Federated Learning
- When Chosen Wisely, More Data Is What You Need: A Universal Sample-Efficient Strategy For Data Augmentation (Mar 17, 2022). Tags: Data Augmentation, HellaSwag
- Graph Flow: Cross-layer Graph Flow Distillation for Dual Efficient Medical Image Segmentation (Mar 16, 2022). Tags: Image Segmentation, Knowledge Distillation
- SATS: Self-Attention Transfer for Continual Semantic Segmentation (Mar 15, 2022). Tags: Continual Semantic Segmentation, Knowledge Distillation
- Unified Visual Transformer Compression (Mar 15, 2022). Tags: Knowledge Distillation
- Representation Compensation Networks for Continual Semantic Segmentation (Mar 10, 2022). Tags: Class-Incremental Learning, Continual Learning
- Knowledge Distillation as Efficient Pre-training: Faster Convergence, Higher Data-efficiency, and Better Transferability (Mar 10, 2022). Tags: Knowledge Distillation
- Prediction-Guided Distillation for Dense Object Detection (Mar 10, 2022). Tags: Dense Object Detection, Knowledge Distillation
- Overcoming Catastrophic Forgetting beyond Continual Learning: Balanced Training for Neural Machine Translation (Mar 8, 2022). Tags: Continual Learning, Knowledge Distillation