Knowledge distillation papers (all entries below list code as available):

- G^2D: Boosting Multimodal Learning with Gradient-Guided Distillation (Jun 26, 2025) | Knowledge Distillation, Model Optimization
- Revisiting Knowledge Distillation for Autoregressive Language Models (Feb 19, 2024) | Knowledge Distillation
- Adaptive Distillation: Aggregating Knowledge from Multiple Paths for Efficient Distillation (Oct 19, 2021) | Knowledge Distillation, Neural Network Compression
- Revisiting Knowledge Distillation under Distribution Shift (Dec 25, 2023) | Data Augmentation, Diversity
- Distillation-based fabric anomaly detection (Jan 4, 2024) | Anomaly Detection, Defect Detection
- Multiple Teachers-Meticulous Student: A Domain Adaptive Meta-Knowledge Distillation Model for Medical Image Classification (Mar 17, 2024) | Image Classification
- F-VLM: Open-Vocabulary Object Detection upon Frozen Vision and Language Models (Sep 30, 2022) | Knowledge Distillation, Object Detection
- Knowledge-guided Causal Intervention for Weakly-supervised Object Localization (Jan 3, 2023) | Knowledge Distillation, Object
- Structured Knowledge Distillation for Dense Prediction (Mar 11, 2019) | Depth Estimation, General Classification
- Distill2Vec: Dynamic Graph Representation Learning with Knowledge Distillation (Nov 11, 2020) | Graph Representation Learning, Knowledge Distillation
- Multi-source-free Domain Adaptation via Uncertainty-aware Adaptive Distillation (Feb 9, 2024) | Domain Adaptation, Knowledge Distillation
- Multi-Stage Balanced Distillation: Addressing Long-Tail Challenges in Sequence-Level Knowledge Distillation (Jun 19, 2024) | Knowledge Distillation
- Multistage Collaborative Knowledge Distillation from a Large Language Model for Semi-Supervised Sequence Generation (Nov 15, 2023) | Constituency Parsing, Knowledge Distillation
- Structured Knowledge Distillation for Semantic Segmentation (Jun 1, 2019) | General Classification, Image Classification
- Multi-stage Distillation Framework for Cross-Lingual Semantic Similarity Matching (Sep 13, 2022) | Contrastive Learning, Knowledge Distillation
- Revisiting Knowledge Distillation via Label Smoothing Regularization (Sep 25, 2019) | Knowledge Distillation, Self-Knowledge Distillation
- Towards Class-wise Fair Adversarial Training via Anti-Bias Soft Label Distillation (Jun 10, 2025) | Adversarial Robustness, Fairness
- WaterMono: Teacher-Guided Anomaly Masking and Enhancement Boosting for Robust Underwater Self-Supervised Monocular Depth Estimation (Jun 19, 2024) | Depth Estimation, Image Enhancement
- Disentangling spatio-temporal knowledge for weakly supervised object detection and segmentation in surgical video (Jul 22, 2024) | Disentanglement, Knowledge Distillation
- FS-BAN: Born-Again Networks for Domain Generalization Few-Shot Classification (Aug 23, 2022) | Domain Generalization, Knowledge Distillation
- From underwater to aerial: a novel multi-scale knowledge distillation approach for coral reef monitoring (Feb 25, 2025) | Knowledge Distillation
- Preference-Consistent Knowledge Distillation for Recommender System (Nov 8, 2023) | Knowledge Distillation, Recommendation Systems
- CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective (Apr 22, 2024) | Contrastive Learning, Image Classification
- Multi-Teacher Knowledge Distillation For Text Image Machine Translation (May 9, 2023) | Decoder, Knowledge Distillation
- Multi Teacher Privileged Knowledge Distillation for Multimodal Expression Recognition (Aug 16, 2024) | Emotion Recognition, Knowledge Distillation
- Multi-to-Single Knowledge Distillation for Point Cloud Semantic Segmentation (Apr 28, 2023) | Knowledge Distillation, Semantic Segmentation
- Right Time to Learn: Promoting Generalization via Bio-inspired Spacing Effect in Knowledge Distillation (Feb 10, 2025) | Knowledge Distillation
- Chemical transformer compression for accelerating both training and inference of molecular modeling (May 16, 2022) | Knowledge Distillation, Model Compression
- Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning (Sep 16, 2024) | Few-Shot Learning, Image Classification
- An Efficient End-to-End Approach to Noise Invariant Speech Features via Multi-Task Learning (Mar 13, 2024) | Denoising, Knowledge Distillation
- Frameless Graph Knowledge Distillation (Jul 13, 2023) | Graph Representation Learning, Knowledge Distillation
- Discourse Structures Guided Fine-grained Propaganda Identification (Oct 28, 2023) | Attribute, Knowledge Distillation
- MiniDisc: Minimal Distillation Schedule for Language Model Compression (May 29, 2022) | Knowledge Distillation, Language Modeling
- FractalAD: A simple industrial anomaly detection method using fractal anomaly generation and backbone knowledge distillation (Jan 30, 2023) | Anomaly Detection, Knowledge Distillation
- Student Becomes Decathlon Master in Retinal Vessel Segmentation via Dual-teacher Multi-target Domain Adaptation (Mar 7, 2022) | Domain Adaptation, Domain Generalization
- Robust and Accurate Object Detection via Self-Knowledge Distillation (Nov 14, 2021) | Adversarial Robustness, Knowledge Distillation
- Multi-View 3D Reconstruction using Knowledge Distillation (Dec 2, 2024) | 3D Reconstruction, Depth Estimation
- Exploring Generalizable Distillation for Efficient Medical Image Segmentation (Jul 26, 2022) | Decoder, Image Segmentation
- DiSCo: LLM Knowledge Distillation for Efficient Sparse Retrieval in Conversational Search (Oct 18, 2024) | Conversational Information Access, Conversational Search
- Digital Staining with Knowledge Distillation: A Unified Framework for Unpaired and Paired-But-Misaligned Data (Apr 14, 2025) | Colorization, Knowledge Distillation
- Analyzing the Confidentiality of Undistillable Teachers in Knowledge Distillation (Dec 1, 2021) | Knowledge Distillation
- Mutual-Learning Knowledge Distillation for Nighttime UAV Tracking (Dec 13, 2023) | Knowledge Distillation
- A Unified Object Counting Network with Object Occupation Prior (Dec 29, 2022) | Crowd Counting, Knowledge Distillation
- Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation (Oct 1, 2021) | Knowledge Distillation, Self-Knowledge Distillation
- Towards Data-Free Domain Generalization (Oct 9, 2021) | Data-free Knowledge Distillation, Domain Generalization
- MV-MR: multi-views and multi-representations for self-supervised learning and knowledge distillation (Mar 21, 2023) | Clustering, Contrastive Learning
- Robust Knowledge Distillation Based on Feature Variance Against Backdoored Teacher Model (Jun 1, 2024) | Knowledge Distillation, Model Compression
- Accelerated Proton Resonance Frequency-based Magnetic Resonance Thermometry by Optimized Deep Learning Method (Jul 3, 2024) | Knowledge Distillation
- Foundation Models for Structural Health Monitoring (Apr 3, 2024) | Anomaly Detection, Knowledge Distillation
- Natural Language Generation for Effective Knowledge Distillation (Nov 1, 2019) | Knowledge Distillation, Linguistic Acceptability