Distilling the Undistillable: Learning from a Nasty Teacher | Oct 21, 2022 | Knowledge Distillation
Tiny Updater: Towards Efficient Neural Network-Driven Software Updating | Jan 1, 2023 | Efficient Neural Network, Image Classification
AdaGMLP: AdaBoosting GNN-to-MLP Knowledge Distillation | May 23, 2024 | Knowledge Distillation
Induced Model Matching: Restricted Models Help Train Full-Featured Models | Jan 15, 2025 | Knowledge Distillation, Language Modeling
Induced Model Matching: How Restricted Models Can Help Larger Ones | Feb 19, 2024 | Knowledge Distillation, Language Modeling
Weight Copy and Low-Rank Adaptation for Few-Shot Distillation of Vision Transformers | Apr 14, 2024 | Knowledge Distillation
Spatial-Channel Token Distillation for Vision MLPs | Jul 23, 2022 | Image Classification, Knowledge Distillation
Masked Student Dataset of Expressions | Apr 7, 2023 | Contrastive Learning, Facial Expression Recognition
InDistill: Information flow-preserving knowledge distillation for model compression | May 20, 2022 | Knowledge Distillation, Model Compression
COMBHelper: A Neural Approach to Reduce Search Space for Graph Combinatorial Problems | Dec 14, 2023 | Combinatorial Optimization, Graph Neural Network
Distilling Knowledge by Mimicking Features | Nov 3, 2020 | Knowledge Distillation, Object Detection
Reciprocal Supervised Learning Improves Neural Machine Translation | Dec 5, 2020 | Image Classification
Why does Knowledge Distillation Work? Rethink its Attention and Fidelity Mechanism | Apr 30, 2024 | Data Augmentation, Diversity
Incremental Meta-Learning via Episodic Replay Distillation for Few-Shot Image Recognition | Nov 9, 2021 | Continual Learning, Knowledge Distillation
MCC-KD: Multi-CoT Consistent Knowledge Distillation | Oct 23, 2023 | Diversity, Knowledge Distillation
To Distill or Not to Distill? On the Robustness of Robust Knowledge Distillation | Jun 6, 2024 | Automatic Speech Recognition (ASR)
UNIKD: UNcertainty-filtered Incremental Knowledge Distillation for Neural Implicit Representation | Dec 21, 2022 | 3D Reconstruction, Incremental Learning
Incorporating Graph Information in Transformer-based AMR Parsing | Jun 23, 2023 | Abstract Meaning Representation, AMR Parsing
An Empirical Study of Pre-trained Language Models in Simple Knowledge Graph Question Answering | Mar 18, 2023 | Graph Question Answering, Knowledge Distillation
Improving Stance Detection with Multi-Dataset Learning and Knowledge Distillation | Nov 1, 2021 | Knowledge Distillation, Stance Detection
Improving Respiratory Sound Classification with Architecture-Agnostic Knowledge Distillation from Ensembles | May 28, 2025 | Knowledge Distillation, Sound Classification
Spatio-Temporal Branching for Motion Prediction using Motion Increments | Aug 2, 2023 | Human Motion Prediction, Knowledge Distillation
Improving Question Answering Performance Using Knowledge Distillation and Active Learning | Sep 26, 2021 | Active Learning, Knowledge Distillation
MedDet: Generative Adversarial Distillation for Efficient Cervical Disc Herniation Detection | Aug 30, 2024 | Knowledge Distillation, Model Compression
AI-KD: Towards Alignment Invariant Face Image Quality Assessment Using Knowledge Distillation | Apr 15, 2024 | Face Alignment, Face Image Quality
Distilling the Knowledge of Romanian BERTs Using Multiple Teachers | Dec 23, 2021 | Dialect Identification, GPU
Distilling the Knowledge of Large-scale Generative Models into Retrieval Models for Efficient Open-domain Conversation | Aug 28, 2021 | Knowledge Distillation, Retrieval
Unsupervised Domain Expansion for Visual Categorization | Apr 1, 2021 | Domain Adaptation, Knowledge Distillation
Improving Neural Topic Models with Wasserstein Knowledge Distillation | Mar 27, 2023 | Knowledge Distillation, Topic Models
SpectralKD: A Unified Framework for Interpreting and Distilling Vision Transformers via Spectral Analysis | Dec 26, 2024 | Knowledge Distillation, Transfer Learning
Improving Neural Architecture Search Image Classifiers via Ensemble Learning | Mar 14, 2019 | Ensemble Learning, Image Classification
MEND: Meta dEmonstratioN Distillation for Efficient and Effective In-Context Learning | Mar 11, 2024 | Decoder, In-Context Learning
Redefining Normal: A Novel Object-Level Approach for Multi-Object Novelty Detection | Dec 15, 2024 | Knowledge Distillation, Novelty Detection
Improving Knowledge Distillation via Transferring Learning Ability | Apr 24, 2023 | Knowledge Distillation
Adaptive Prompt Learning with Distilled Connective Knowledge for Implicit Discourse Relation Recognition | Sep 14, 2023 | Knowledge Distillation, Prompt Learning
Redistributing Low-Frequency Words: Making the Most of Monolingual Data in Non-Autoregressive Translation | May 1, 2022 | Knowledge Distillation, Translation
Reducing Capacity Gap in Knowledge Distillation with Review Mechanism for Crowd Counting | Jun 11, 2022 | Computational Efficiency, Crowd Counting
Improving generalizability of distilled self-supervised speech processing models under distorted settings | Oct 14, 2022 | Knowledge Distillation
Reducing Spatial Fitting Error in Distillation of Denoising Diffusion Models | Nov 7, 2023 | Attribute, Denoising
Improving End-to-End Speech Translation by Imitation-Based Knowledge Distillation with Synthetic Transcripts | Jul 17, 2023 | Automatic Speech Translation, Imitation Learning
Auxiliary Learning for Self-Supervised Video Representation via Similarity-based Knowledge Distillation | Dec 7, 2021 | Auxiliary Learning, Knowledge Distillation
Autoregressive Knowledge Distillation through Imitation Learning | Sep 15, 2020 | Imitation Learning, Knowledge Distillation
An Efficient Memory Module for Graph Few-Shot Class-Incremental Learning | Nov 11, 2024 | Class-Incremental Learning
Improving Robustness by Enhancing Weak Subnets | Jan 30, 2022 | Adversarial Robustness, Data Augmentation
Improving Adversarial Robust Fairness via Anti-Bias Soft Label Distillation | Dec 9, 2023 | Adversarial Robustness, Fairness
Improved Knowledge Distillation via Teacher Assistant | Feb 9, 2019 | Knowledge Distillation
Collective Relevance Labeling for Passage Retrieval | May 6, 2022 | Information Retrieval, Knowledge Distillation
Improved Knowledge Distillation for Crowd Counting on IoT Device | Aug 2, 2023 | Crowd Counting, Knowledge Distillation
IE-GAN: An Improved Evolutionary Generative Adversarial Network Using a New Fitness Function and a Generic Crossover Operator | Jul 25, 2021 | Evolutionary Algorithms, Generative Adversarial Network
Distilling Stereo Networks for Performant and Efficient Leaner Networks | Mar 24, 2025 | General Knowledge, Knowledge Distillation