All entries below have code available.

SCAN: A Scalable Neural Networks Framework Towards Compact and Efficient Models (May 27, 2019) [Knowledge Distillation]
On Exploring Pose Estimation as an Auxiliary Learning Task for Visible-Infrared Person Re-identification (Jan 11, 2022) [Auxiliary Learning, Knowledge Distillation]
Weakly Supervised Change Detection via Knowledge Distillation and Multiscale Sigmoid Inference (Mar 9, 2024) [Change Detection, Knowledge Distillation]
Low-Cost Self-Ensembles Based on Multi-Branch Transformation and Grouped Convolution (Aug 5, 2024) [Classification, Diversity]
FedBrain-Distill: Communication-Efficient Federated Brain Tumor Classification Using Ensemble Knowledge Distillation on Non-IID Data (Sep 9, 2024) [Brain Tumor Classification, Federated Learning]
A Dual-Contrastive Framework for Low-Resource Cross-Lingual Named Entity Recognition (Apr 2, 2022) [Contrastive Learning, Cross-Lingual NER]
SynthDistill: Face Recognition with Knowledge Distillation from Synthetic Data (Aug 28, 2023) [Face Recognition, Knowledge Distillation]
Synthetic data generation method for data-free knowledge distillation in regression neural networks (Jan 11, 2023) [Data-free Knowledge Distillation, Knowledge Distillation]
FedBKD: Distilled Federated Learning to Embrace Generalization and Personalization on Non-IID Data (Jun 25, 2025) [Federated Learning, Knowledge Distillation]
Online Adversarial Knowledge Distillation for Graph Neural Networks (Dec 28, 2021) [Knowledge Distillation]
Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks (Sep 19, 2024) [Knowledge Distillation]
Towards Low-Latency Event Stream-based Visual Object Tracking: A Slow-Fast Approach (May 19, 2025) [Knowledge Distillation, Object Tracking]
Tackling Data Heterogeneity in Federated Learning through Knowledge Distillation with Inequitable Aggregation (Jun 25, 2025) [Federated Learning, Knowledge Distillation]
SCJD: Sparse Correlation and Joint Distillation for Efficient 3D Human Pose Estimation (Mar 18, 2025) [3D Human Pose Estimation, Knowledge Distillation]
SCKD: Semi-Supervised Cross-Modality Knowledge Distillation for 4D Radar Object Detection (Dec 19, 2024) [3D Object Detection, Autonomous Vehicles]
Online Ensemble Model Compression using Knowledge Distillation (Nov 15, 2020) [Knowledge Distillation, Model]
Understanding the Effect of Model Compression on Social Bias in Large Language Models (Dec 9, 2023) [Knowledge Distillation, Model Compression]
Feature Representation Learning for Robust Retinal Disease Detection from Optical Coherence Tomography Images (Jun 24, 2022) [Decoder, Knowledge Distillation]
Feature Normalized Knowledge Distillation for Image Classification (Aug 1, 2020) [Classification, General Classification]
An Embarrassingly Simple Approach for Knowledge Distillation (Dec 5, 2018) [Face Recognition, Knowledge Distillation]
Declarative Knowledge Distillation from Large Language Models for Visual Question Answering Datasets (Oct 12, 2024) [Knowledge Distillation, Question Answering]
Feature Fusion for Online Mutual Knowledge Distillation (Apr 19, 2019) [Knowledge Distillation]
Online Knowledge Distillation with Diverse Peers (Dec 1, 2019) [Knowledge Distillation, Transfer Learning]
Dealing With Heterogeneous 3D MR Knee Images: A Federated Few-Shot Learning Method With Dual Knowledge Distillation (Mar 25, 2023) [Federated Learning, Few-Shot Learning]
Towards Mitigating Architecture Overfitting on Distilled Datasets (Sep 8, 2023) [Dataset Distillation, Knowledge Distillation]
Bridging the Gap between Decision and Logits in Decision-based Knowledge Distillation for Pre-trained Language Models (Jun 15, 2023) [Data Augmentation, Knowledge Distillation]
Faster gaze prediction with dense networks and Fisher pruning (Jan 17, 2018) [Gaze Estimation, Gaze Prediction]
AMR-Evol: Adaptive Modular Response Evolution Elicits Better Knowledge Distillation for Large Language Models in Code Generation (Oct 1, 2024) [Code Generation, HumanEval]
FastAST: Accelerating Audio Spectrogram Transformer via Token Merging and Cross-Model Knowledge Distillation (Jun 11, 2024) [Audio Classification, Knowledge Distillation]
On Membership Inference Attacks in Knowledge Distillation (May 17, 2025) [Knowledge Distillation, Privacy Preserving]
TAKE: Topic-shift Aware Knowledge sElection for Dialogue Generation (Oct 1, 2022) [Dialogue Generation, Knowledge Distillation]
Towards Multi-Morphology Controllers with Diversity and Knowledge Distillation (Apr 22, 2024) [Diversity, Knowledge Distillation]
VECT-GAN: A variationally encoded generative model for overcoming data scarcity in pharmaceutical science (Jan 15, 2025) [Generative Adversarial Network, Knowledge Distillation]
Fantastic Gains and Where to Find Them: On the Existence and Prospect of General Knowledge Transfer between Any Pretrained Model (Oct 26, 2023) [Data Augmentation, General Knowledge]
DCA: Dividing and Conquering Amnesia in Incremental Object Detection (Mar 19, 2025) [Knowledge Distillation, Object Detection]
SecFormer: Fast and Accurate Privacy-Preserving Inference for Transformer Models via SMPC (Jan 1, 2024) [Knowledge Distillation, Privacy Preserving]
Bridging Modalities: Knowledge Distillation and Masked Training for Translating Multi-Modal Emotion Recognition to Uni-Modal, Speech-Only Emotion Recognition (Jan 4, 2024) [Emotion Recognition, Knowledge Distillation]
On the Byzantine-Resilience of Distillation-Based Federated Learning (Feb 19, 2024) [Federated Learning, Knowledge Distillation]
Multi-Teacher Language-Aware Knowledge Distillation for Multilingual Speech Emotion Recognition (Jun 10, 2025) [Emotion Recognition, Knowledge Distillation]
Understanding the Role of Mixup in Knowledge Distillation: An Empirical Study (Nov 8, 2022) [Attribute, Data Augmentation]
Distilled Circuits: A Mechanistic Study of Internal Restructuring in Knowledge Distillation (May 16, 2025) [Knowledge Distillation]
FANFOLD: Graph Normalizing Flows-driven Asymmetric Network for Unsupervised Graph-Level Anomaly Detection (Jun 29, 2024) [Anomaly Detection, Knowledge Distillation]
Data Upcycling Knowledge Distillation for Image Super-Resolution (Sep 25, 2023) [Image Super-Resolution, Knowledge Distillation]
On the Efficacy of Small Self-Supervised Contrastive Models without Distillation Signals (Jul 30, 2021) [Clustering, Contrastive Learning]
AMLNet: Adversarial Mutual Learning Neural Network for Non-AutoRegressive Multi-Horizon Time Series Forecasting (Oct 30, 2023) [Decoder, Diversity]
On the Generalization vs Fidelity Paradox in Knowledge Distillation (May 21, 2025) [Knowledge Distillation, Transfer Learning]
Segmenting the Future (Apr 24, 2019) [Autonomous Driving, Decision Making]
SeizureNet: Multi-Spectral Deep Feature Learning for Seizure Type Classification (Mar 8, 2019) [Classification, EEG]
Attention-Based Depth Distillation with 3D-Aware Positional Encoding for Monocular 3D Object Detection (Nov 30, 2022) [3D Object Detection, Depth Estimation]
Bridging Dimensions: Confident Reachability for High-Dimensional Controllers (Nov 8, 2023) [Knowledge Distillation, OpenAI Gym]