Papers on knowledge distillation (code available for each entry):

- SIKeD: Self-guided Iterative Knowledge Distillation for Mathematical Reasoning (Oct 24, 2024): Knowledge Distillation, Mathematical Reasoning
- Unlearning Backdoor Attacks for LLMs with Weak-to-Strong Knowledge Distillation (Oct 18, 2024): Backdoor Attack, Knowledge Distillation
- Less-supervised Learning with Knowledge Distillation for Sperm Morphology Analysis (May 8, 2024): Anomaly Detection, Knowledge Distillation
- Knowledge Distillation Performs Partial Variance Reduction (May 27, 2023): Knowledge Distillation
- Better Teacher Better Student: Dynamic Prior Knowledge for Knowledge Distillation (Jun 13, 2022): Image Classification
- Applying Knowledge Distillation to Improve Weed Mapping with Drones (Oct 8, 2023): Knowledge Distillation, Management
- Simon Says: Evaluating and Mitigating Bias in Pruned Neural Networks with Knowledge Distillation (Jun 15, 2021): Fairness, Knowledge Distillation
- DynaMMo: Dynamic Model Merging for Efficient Class Incremental Learning for Medical Images (Apr 22, 2024): Class Incremental Learning
- Continual Contrastive Learning for Image Classification (Jul 5, 2021): Classification, Continual Learning
- Dynamic Sub-graph Distillation for Robust Semi-supervised Continual Learning (Dec 27, 2023): Continual Learning, Graph Construction
- Dynamic Rectification Knowledge Distillation (Jan 27, 2022): Edge Computing, Knowledge Distillation
- DVFL-Net: A Lightweight Distilled Video Focal Modulation Network for Spatio-Temporal Action Recognition (Jul 16, 2025): Benchmarking, Knowledge Distillation
- Projected Latent Distillation for Data-Agnostic Consolidation in Distributed Continual Learning (Mar 28, 2023): Continual Learning, Knowledge Distillation
- Leveraging Diffusion-Based Image Variations for Robust Training on Poisoned Data (Oct 10, 2023): Knowledge Distillation
- An Unsupervised Multiple-Task and Multiple-Teacher Model for Cross-lingual Named Entity Recognition (Nov 16, 2021): Cross-Lingual NER, Knowledge Distillation
- Simple Semi-supervised Knowledge Distillation from Vision-Language Models via Dual-Head Optimization (May 12, 2025): Few-Shot Image Classification, Knowledge Distillation
- A Comprehensive Overhaul of Feature Distillation (Apr 3, 2019): General Classification, Image Classification
- Leveraging Foundation Models via Knowledge Distillation in Multi-Object Tracking: Distilling DINOv2 Features to FairMOT (Jul 25, 2024): Knowledge Distillation, Multi-Object Tracking
- Leveraging Knowledge Distillation for Efficient Deep Reinforcement Learning in Resource-Constrained Environments (Oct 16, 2023): Decision Making, Deep Reinforcement Learning
- Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation (Sep 29, 2023): Cross-Lingual Question Answering, Cross-Lingual Transfer
- Leveraging Knowledge Distillation for Partial Multi-task Learning from Multiple Remote Sensing Datasets (May 24, 2024): Knowledge Distillation, Multi-Task Learning
- Leveraging Large Language Models for Active Merchant Non-player Characters (Dec 15, 2024): Knowledge Distillation
- TextKD-GAN: Text Generation Using Knowledge Distillation and Generative Adversarial Networks (Apr 23, 2019): Image Generation, Knowledge Distillation
- Continual Coarse-to-Fine Domain Adaptation in Semantic Segmentation (Jan 18, 2022): Domain Adaptation, Knowledge Distillation
- Leveraging Topological Guidance for Improved Knowledge Distillation (Jul 7, 2024): Image Classification
- Dual Correction Strategy for Ranking Distillation in Top-N Recommender System (Sep 8, 2021): Knowledge Distillation, Recommendation Systems
- Knowledge Distillation of Russian Language Models with Reduction of Vocabulary (May 4, 2022): Knowledge Distillation
- Knowledge Distillation Layer that Lets the Student Decide (Sep 6, 2023): Knowledge Distillation
- DSMix: Distortion-Induced Sensitivity Map Based Pre-training for No-Reference Image Quality Assessment (Jul 4, 2024): Data Augmentation, Image Quality Assessment
- DSG-KD: Knowledge Distillation from Domain-Specific to General Language Models (Sep 23, 2024): Knowledge Distillation, Transfer Learning
- Better Supervisory Signals by Observing Learning Paths (Mar 4, 2022): Knowledge Distillation
- Knowledge Distillation in RNN-Attention Models for Early Prediction of Student Performance (Dec 19, 2024): Knowledge Distillation, Student Dropout
- DS_FusionNet: Dynamic Dual-Stream Fusion with Bidirectional Knowledge Distillation for Plant Disease Recognition (Apr 29, 2025): Fine-Grained Image Classification, Image Classification
- DROP: Poison Dilution via Knowledge Distillation for Federated Learning (Feb 10, 2025): Data Poisoning, Federated Learning
- Prototype-guided Cross-task Knowledge Distillation for Large-scale Models (Dec 26, 2022): Knowledge Distillation
- Do You Remember... the Future? Weak-to-Strong Generalization in 3D Object Detection (Aug 3, 2024): 3D Object Detection, Knowledge Distillation
- Context Unaware Knowledge Distillation for Image Retrieval (Jul 19, 2022): Image Retrieval, Knowledge Distillation
- Proxy-Anchor and EVT-Driven Continual Learning Method for Generalized Category Discovery (Apr 11, 2025): Continual Learning, Knowledge Distillation
- BEiT v2: Masked Image Modeling with Vector-Quantized Visual Tokenizers (Aug 12, 2022): Image Classification
- Knowledge Distillation from Single to Multi Labels: An Empirical Study (Mar 15, 2023): Classification, Image Classification
- PrUE: Distilling Knowledge from Sparse Teacher Networks (Jul 3, 2022): Knowledge Distillation
- Domain-Lifelong Learning for Dialogue State Tracking via Knowledge Preservation Networks (Nov 1, 2021): Dialogue State Tracking, Diversity
- Few Sample Knowledge Distillation for Efficient Network Compression (Dec 5, 2018): Knowledge Distillation, Network Pruning
- Knowledge Distillation from Cross Teaching Teachers for Efficient Semi-Supervised Abdominal Organ Segmentation in CT (Nov 11, 2022): Image Segmentation, Knowledge Distillation
- Knowledge Distillation for Wireless Edge Learning (Apr 3, 2021): Cloud Computing, Federated Learning
- Light Multi-segment Activation for Model Compression (Jul 16, 2019): Knowledge Distillation
- Lightning Fast Video Anomaly Detection via Adversarial Knowledge Distillation (Nov 28, 2022): Anomaly Detection, Knowledge Distillation
- Domain Knowledge Transferring for Pre-trained Language Model via Calibrated Activation Boundary Distillation (May 1, 2022): Knowledge Distillation, Language Modeling
- LightPath: Lightweight and Scalable Path Representation Learning (Jul 19, 2023): Knowledge Distillation, Relational Reasoning
- Domain Generalization for Crop Segmentation with Standardized Ensemble Knowledge Distillation (Apr 3, 2023): Domain Generalization, Knowledge Distillation