All papers listed below have code available.

- CDFKD-MFS: Collaborative Data-free Knowledge Distillation via Multi-level Feature Sharing (May 24, 2022). Tags: Data-free Knowledge Distillation, Knowledge Distillation
- An Unsupervised Multiple-Task and Multiple-Teacher Model for Cross-lingual Named Entity Recognition (Nov 16, 2021). Tags: Cross-Lingual NER, Knowledge Distillation
- HiTSR: A Hierarchical Transformer for Reference-based Super-Resolution (Aug 30, 2024). Tags: Image Super-Resolution, Knowledge Distillation
- CoReD: Generalizing Fake Media Detection with Continual Representation using Distillation (Jul 6, 2021). Tags: Continual Learning, Domain Adaptation
- Highlight Every Step: Knowledge Distillation via Collaborative Teaching (Jul 23, 2019). Tags: Knowledge Distillation
- Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy (Aug 29, 2022). Tags: Data-free Knowledge Distillation, Knowledge Distillation
- PyNET-QxQ: An Efficient PyNET Variant for QxQ Bayer Pattern Demosaicing in CMOS Image Sensors (Mar 8, 2022). Tags: Demosaicking, Knowledge Distillation
- Distilling Stereo Networks for Performant and Efficient Leaner Networks (Mar 24, 2025). Tags: General Knowledge, Knowledge Distillation
- Handling Data Heterogeneity in Federated Learning via Knowledge Distillation and Fusion (Jul 23, 2022). Tags: Data-free Knowledge Distillation, Fairness
- HDKD: Hybrid Data-Efficient Knowledge Distillation Network for Medical Image Classification (Jul 10, 2024). Tags: Computational Efficiency, Image Classification
- Correlation Congruence for Knowledge Distillation (Apr 3, 2019). Tags: Face Recognition, Image Classification
- GSB: Group Superposition Binarization for Vision Transformer with Limited Training Samples (May 13, 2023). Tags: Binarization, Knowledge Distillation
- Group Multi-View Transformer for 3D Shape Analysis with Spatial Encoding (Dec 27, 2023). Tags: 3D Classification, 3D Shape Recognition
- Feature Fusion for Online Mutual Knowledge Distillation (Apr 19, 2019). Tags: Knowledge Distillation
- GSSF: Generalized Structural Sparse Function for Deep Cross-modal Metric Learning (Oct 20, 2024). Tags: Image Retrieval, Image-Text Retrieval
- Guiding Frame-Level CTC Alignments Using Self-knowledge Distillation (Jun 12, 2024). Tags: Automatic Speech Recognition (ASR)
- Answering Diverse Questions via Text Attached with Key Audio-Visual Clues (Mar 11, 2024). Tags: Audio-Visual Question Answering (AVQA)
- Distilling Object Detectors With Global Knowledge (Oct 17, 2022). Tags: Knowledge Distillation, Object Detection
- Greedy-layer Pruning: Speeding up Transformer Models for Natural Language Processing (May 31, 2021). Tags: Knowledge Distillation, Unsupervised Pre-training
- Feature Representation Learning for Robust Retinal Disease Detection from Optical Coherence Tomography Images (Jun 24, 2022). Tags: Decoder, Knowledge Distillation
- Distilling Object Detectors with Fine-grained Feature Imitation (Jun 9, 2019). Tags: Knowledge Distillation, Object Detection
- Catch-Up Distillation: You Only Need to Train Once for Accelerating Sampling (May 18, 2023). Tags: Knowledge Distillation
- Catastrophic Interference in Reinforcement Learning: A Solution Based on Context Division and Knowledge Distillation (Sep 1, 2021). Tags: Deep Reinforcement Learning, General Reinforcement Learning
- Graph Knowledge Distillation to Mixture of Experts (Jun 17, 2024). Tags: Knowledge Distillation, Mixture-of-Experts
- Distilling Reasoning Capabilities into Smaller Language Models (Dec 1, 2022). Tags: GSM8K, Knowledge Distillation
- FedBKD: Distilled Federated Learning to Embrace Generalization and Personalization on Non-IID Data (Jun 25, 2025). Tags: Federated Learning, Knowledge Distillation
- FedBrain-Distill: Communication-Efficient Federated Brain Tumor Classification Using Ensemble Knowledge Distillation on Non-IID Data (Sep 9, 2024). Tags: Brain Tumor Classification, Federated Learning
- Graph-based Knowledge Distillation by Multi-head Attention Network (Jul 4, 2019). Tags: Inductive Bias, Knowledge Distillation
- Gradient Knowledge Distillation for Pre-trained Language Models (Nov 2, 2022). Tags: Knowledge Distillation
- Graph Entropy Minimization for Semi-supervised Node Classification (May 31, 2023). Tags: Classification, Knowledge Distillation
- Distilling Model Knowledge (Oct 8, 2015). Tags: Bayesian Inference, BIG-bench Machine Learning
- Class Incremental Learning with Probability Dampening and Cascaded Gated Classifier (Feb 2, 2024). Tags: Class-Incremental Learning
- Distilling Local Texture Features for Colorectal Tissue Classification in Low Data Regimes (Jan 2, 2024). Tags: Knowledge Distillation
- GOTHAM: Graph Class Incremental Learning Framework under Weak Supervision (Apr 7, 2025). Tags: Attribute, Class-Incremental Learning
- FedDW: Distilling Weights through Consistency Optimization in Heterogeneous Federated Learning (Dec 5, 2024). Tags: Federated Learning, Knowledge Distillation
- LIDAR and Position-Aided mmWave Beam Selection with Non-local CNNs and Curriculum Training (Apr 29, 2021). Tags: Knowledge Distillation, Position
- Reinforced Knowledge Distillation for Time Series Regression (Jun 21, 2024). Tags: Knowledge Distillation, Model Compression
- Rejuvenating Low-Frequency Words: Making the Most of Parallel Data in Non-Autoregressive Translation (Jun 2, 2021). Tags: Knowledge Distillation, Translation
- Goal-Conditioned Q-Learning as Knowledge Distillation (Aug 28, 2022). Tags: Knowledge Distillation, Q-Learning
- Goldfish: An Efficient Federated Unlearning Framework (Apr 4, 2024). Tags: Knowledge Distillation, Machine Unlearning
- CaPriDe Learning: Confidential and Private Decentralized Learning Based on Encryption-Friendly Distillation Loss (Jan 1, 2023). Tags: Federated Learning, Knowledge Distillation
- A Diversity-Enhanced Knowledge Distillation Model for Practical Math Word Problem Solving (Jan 7, 2025). Tags: Diversity, Knowledge Distillation
- GNN's Uncertainty Quantification using Self-Distillation (Jun 24, 2025). Tags: Knowledge Distillation, Uncertainty Quantification
- Spending Your Winning Lottery Better After Drawing It (Jan 8, 2021). Tags: Knowledge Distillation
- Improved Knowledge Distillation for Crowd Counting on IoT Device (Aug 2, 2023). Tags: Crowd Counting, Knowledge Distillation
- Federated Incremental Named Entity Recognition (Nov 18, 2024). Tags: Knowledge Distillation, Named Entity Recognition
- CAPEEN: Image Captioning with Early Exits and Knowledge Distillation (Oct 6, 2024). Tags: Descriptive, Image Captioning
- GKD: Semi-supervised Graph Knowledge Distillation for Graph-Independent Inference (Apr 8, 2021). Tags: Disease Prediction, Graph Construction
- GKT: A Novel Guidance-Based Knowledge Transfer Framework For Efficient Cloud-edge Collaboration LLM Deployment (May 30, 2024). Tags: GSM8K, Knowledge Distillation
- Generative Denoise Distillation: Simple Stochastic Noises Induce Efficient Knowledge Transfer for Dense Prediction (Jan 16, 2024). Tags: Instance Segmentation, Knowledge Distillation