Code is available for every paper listed below.

- Inter-Domain Alignment for Predicting High-Resolution Brain Networks Using Teacher-Student Learning (Oct 6, 2021). Tags: Decoder, Domain Adaptation
- Dynamic Rectification Knowledge Distillation (Jan 27, 2022). Tags: Edge Computing, Knowledge Distillation
- Interpretable Embedding Procedure Knowledge Transfer via Stacked Principal Component Analysis and Graph Neural Network (Apr 28, 2021). Tags: Graph Neural Network, Knowledge Distillation
- Induced Model Matching: How Restricted Models Can Help Larger Ones (Feb 19, 2024). Tags: Knowledge Distillation, Language Modeling
- InDistill: Information Flow-Preserving Knowledge Distillation for Model Compression (May 20, 2022). Tags: Knowledge Distillation, Model Compression
- Induced Model Matching: Restricted Models Help Train Full-Featured Models (Jan 15, 2025). Tags: Knowledge Distillation, Language Modeling
- Asymmetric Masked Distillation for Pre-Training Small Foundation Models (Nov 6, 2023). Tags: Action Classification, Action Recognition
- Learning to "Segment Anything" in Thermal Infrared Images through Knowledge Distillation with a Large Scale Dataset SATIR (Apr 17, 2023). Tags: Image Segmentation, Knowledge Distillation
- Closest Neighbors are Harmful for Lightweight Masked Auto-encoders (Jan 1, 2025). Tags: Knowledge Distillation
- 3M-Health: Multimodal Multi-Teacher Knowledge Distillation for Mental Health Detection (Jul 12, 2024). Tags: Knowledge Distillation, Social Media Mental Health Detection
- Complex Facial Expression Recognition Using Deep Knowledge Distillation of Basic Features (Aug 11, 2023). Tags: Continual Learning, Emotion Recognition
- DVFL-Net: A Lightweight Distilled Video Focal Modulation Network for Spatio-Temporal Action Recognition (Jul 16, 2025). Tags: Benchmarking, Knowledge Distillation
- Leveraging Diffusion-Based Image Variations for Robust Training on Poisoned Data (Oct 10, 2023). Tags: Knowledge Distillation
- Leveraging Foundation Models via Knowledge Distillation in Multi-Object Tracking: Distilling DINOv2 Features to FairMOT (Jul 25, 2024). Tags: Knowledge Distillation, Multi-Object Tracking
- Distilling Knowledge by Mimicking Features (Nov 3, 2020). Tags: Knowledge Distillation, Object Detection
- Leveraging Knowledge Distillation for Partial Multi-Task Learning from Multiple Remote Sensing Datasets (May 24, 2024). Tags: Knowledge Distillation, Multi-Task Learning
- Incorporating Graph Information in Transformer-based AMR Parsing (Jun 23, 2023). Tags: Abstract Meaning Representation, AMR Parsing
- Improving Stance Detection with Multi-Dataset Learning and Knowledge Distillation (Nov 1, 2021). Tags: Knowledge Distillation, Stance Detection
- UNIKD: UNcertainty-filtered Incremental Knowledge Distillation for Neural Implicit Representation (Dec 21, 2022). Tags: 3D Reconstruction, Incremental Learning
- Improving Question Answering Performance Using Knowledge Distillation and Active Learning (Sep 26, 2021). Tags: Active Learning, Knowledge Distillation
- Improving Respiratory Sound Classification with Architecture-Agnostic Knowledge Distillation from Ensembles (May 28, 2025). Tags: Knowledge Distillation, Sound Classification
- A Systematic Study of Knowledge Distillation for Natural Language Generation with Pseudo-Target Training (May 3, 2023). Tags: Knowledge Distillation, Text Generation
- Adversarial Moment-Matching Distillation of Large Language Models (Jun 5, 2024). Tags: Imitation Learning, Instruction Following
- Incremental Meta-Learning via Episodic Replay Distillation for Few-Shot Image Recognition (Nov 9, 2021). Tags: Continual Learning, Knowledge Distillation
- Infusing Sequential Information into Conditional Masked Translation Model with Self-Review Mechanism (Oct 19, 2020). Tags: Decoder, Knowledge Distillation
- Improving Neural Architecture Search Image Classifiers via Ensemble Learning (Mar 14, 2019). Tags: Ensemble Learning, Image Classification
- CLIMB-3D: Continual Learning for Imbalanced 3D Instance Segmentation (Feb 24, 2025). Tags: 3D Instance Segmentation, Continual Learning
- Improving Generalizability of Distilled Self-Supervised Speech Processing Models under Distorted Settings (Oct 14, 2022). Tags: Knowledge Distillation
- Improving Knowledge Distillation via Transferring Learning Ability (Apr 24, 2023). Tags: Knowledge Distillation
- Improving Robustness by Enhancing Weak Subnets (Jan 30, 2022). Tags: Adversarial Robustness, Data Augmentation
- Improving Adversarial Robust Fairness via Anti-Bias Soft Label Distillation (Dec 9, 2023). Tags: Adversarial Robustness, Fairness
- Dual Correction Strategy for Ranking Distillation in Top-N Recommender System (Sep 8, 2021). Tags: Knowledge Distillation, Recommendation Systems
- Improving End-to-End Speech Translation by Imitation-Based Knowledge Distillation with Synthetic Transcripts (Jul 17, 2023). Tags: Automatic Speech Translation, Imitation Learning
- Improved Knowledge Distillation via Teacher Assistant (Feb 9, 2019). Tags: Knowledge Distillation
- Improved Knowledge Distillation for Crowd Counting on IoT Devices (Aug 2, 2023). Tags: Crowd Counting, Knowledge Distillation
- IE-GAN: An Improved Evolutionary Generative Adversarial Network Using a New Fitness Function and a Generic Crossover Operator (Jul 25, 2021). Tags: Evolutionary Algorithms, Generative Adversarial Network
- Improving Neural Topic Models with Wasserstein Knowledge Distillation (Mar 27, 2023). Tags: Knowledge Distillation, Topic Models
- Locally Differentially Private Distributed Deep Learning via Knowledge Distillation (Feb 7, 2022). Tags: Deep Learning, Knowledge Distillation
- KD-VLP: Improving End-to-End Vision-and-Language Pretraining with Object Knowledge Distillation (Sep 22, 2021). Tags: Cross-Modal Alignment, Knowledge Distillation
- DSMix: Distortion-Induced Sensitivity Map Based Pre-training for No-Reference Image Quality Assessment (Jul 4, 2024). Tags: Data Augmentation, Image Quality Assessment
- DSG-KD: Knowledge Distillation from Domain-Specific to General Language Models (Sep 23, 2024). Tags: Knowledge Distillation, Transfer Learning
- DS_FusionNet: Dynamic Dual-Stream Fusion with Bidirectional Knowledge Distillation for Plant Disease Recognition (Apr 29, 2025). Tags: Fine-Grained Image Classification, Image Classification
- Hybrid Attention Model Using Feature Decomposition and Knowledge Distillation for Glucose Forecasting (Nov 16, 2024). Tags: Knowledge Distillation
- DROP: Poison Dilution via Knowledge Distillation for Federated Learning (Feb 10, 2025). Tags: Data Poisoning, Federated Learning
- AdaBERT: Task-Adaptive BERT Compression with Differentiable Neural Architecture Search (Jan 13, 2020). Tags: Knowledge Distillation, Neural Architecture Search
- Hybrid Data-Free Knowledge Distillation (Dec 18, 2024). Tags: Data-Free Knowledge Distillation, Generative Adversarial Network
- Human Guided Exploitation of Interpretable Attention Patterns in Summarization and Topic Segmentation (Dec 10, 2021). Tags: Extractive Summarization, Knowledge Distillation
- Enhancing New-Item Fairness in Dynamic Recommender Systems (Apr 30, 2025). Tags: Fairness, Knowledge Distillation
- HVDistill: Transferring Knowledge from Images to Point Clouds via Unsupervised Hybrid-View Distillation (Mar 18, 2024). Tags: Knowledge Distillation, NER
- Do You Remember... the Future? Weak-to-Strong Generalization in 3D Object Detection (Aug 3, 2024). Tags: 3D Object Detection, Knowledge Distillation