- Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection · Sep 16, 2018 · Classification, General Classification
- Channel Gating Neural Networks · May 29, 2018 · Knowledge Distillation, Network Pruning · [Code Available]
- Grad-CAM++: Improved Visual Explanations for Deep Convolutional Networks · Oct 30, 2017 · 3D Action Recognition, Action Recognition · [Code Available]
- Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer · Dec 12, 2016 · Knowledge Distillation · [Code Available]
- Sequence-Level Knowledge Distillation · Jun 25, 2016 · Knowledge Distillation, Machine Translation · [Code Available]
- Distilling the Knowledge in a Neural Network · Mar 9, 2015 · Knowledge Distillation, Mixture-of-Experts · [Code Available]
- FitNets: Hints for Thin Deep Nets · Dec 19, 2014 · Knowledge Distillation · [Code Available]
- Visual-Language Model Knowledge Distillation Method for Image Quality Assessment · Jul 21, 2025 · Image Quality Assessment, Knowledge Distillation · [Code Available]
- Uncertainty-Aware Cross-Modal Knowledge Distillation with Prototype Learning for Multimodal Brain-Computer Interfaces · Jul 17, 2025 · EEG, Knowledge Distillation · [Unverified]
- DVFL-Net: A Lightweight Distilled Video Focal Modulation Network for Spatio-Temporal Action Recognition · Jul 16, 2025 · Benchmarking, Knowledge Distillation · [Unverified]
- HanjaBridge: Resolving Semantic Ambiguity in Korean LLMs via Hanja-Augmented Pre-Training · Jul 15, 2025 · Cross-Lingual Transfer, Knowledge Distillation · [Code Available]
- Feature Distillation is the Better Choice for Model-Heterogeneous Federated Learning · Jul 14, 2025 · Federated Learning, Knowledge Distillation · [Unverified]
- KAT-V1: Kwai-AutoThink Technical Report · Jul 11, 2025 · Knowledge Distillation, Large Language Model · [Unverified]
- SFedKD: Sequential Federated Learning with Discrepancy-Aware Multi-Teacher Knowledge Distillation · Jul 11, 2025 · Federated Learning, Knowledge Distillation · [Unverified]
- Towards Collaborative Fairness in Federated Learning Under Imbalanced Covariate Shift · Jul 11, 2025 · Collaborative Fairness, Fairness · [Unverified]
- The Trilemma of Truth in Large Language Models · Jun 30, 2025 · Attribute, Conformal Prediction · [Unverified]
- Layer Importance for Mathematical Reasoning is Forged in Pre-Training and Invariant after Post-Training · Jun 27, 2025 · Knowledge Distillation, Mathematical Reasoning · [Code Available]
- Continual Self-Supervised Learning with Masked Autoencoders in Remote Sensing · Jun 26, 2025 · Continual Learning, Continual Self-Supervised Learning · [Unverified]
- Distilling Normalizing Flows · Jun 26, 2025 · Density Estimation, Knowledge Distillation · [Unverified]
- G^2D: Boosting Multimodal Learning with Gradient-Guided Distillation · Jun 26, 2025 · Knowledge Distillation, Model Optimization · [Unverified]
- Tackling Data Heterogeneity in Federated Learning through Knowledge Distillation with Inequitable Aggregation · Jun 25, 2025 · Federated Learning, Knowledge Distillation · [Code Available]
- FedBKD: Distilled Federated Learning to Embrace Generalization and Personalization on Non-IID Data · Jun 25, 2025 · Federated Learning, Knowledge Distillation · [Code Available]
- Towards Scalable and Generalizable Earth Observation Data Mining via Foundation Model Composition · Jun 25, 2025 · Earth Observation, Knowledge Distillation · [Code Available]
- Client Clustering Meets Knowledge Sharing: Enhancing Privacy and Robustness in Personalized Peer-to-Peer Learning · Jun 25, 2025 · Knowledge Distillation, Transfer Learning · [Unverified]
- Building Lightweight Semantic Segmentation Models for Aerial Images Using Dual Relation Distillation · Jun 25, 2025 · Knowledge Distillation, Relation · [Unverified]
- Distillation-Enabled Knowledge Alignment for Generative Semantic Communications in AIGC Provisioning Tasks · Jun 24, 2025 · Knowledge Distillation, Semantic Communication · [Unverified]
- Recalling The Forgotten Class Memberships: Unlearned Models Can Be Noisy Labelers to Leak Privacy · Jun 24, 2025 · Knowledge Distillation, Learning with noisy labels · [Unverified]
- GNN's Uncertainty Quantification using Self-Distillation · Jun 24, 2025 · Knowledge Distillation, Uncertainty Quantification · [Unverified]
- PicoSAM2: Low-Latency Segmentation In-Sensor for Edge Vision Applications · Jun 23, 2025 · Knowledge Distillation, Privacy Preserving · [Code Available]
- Multimodal Fusion SLAM with Fourier Attention · Jun 22, 2025 · Knowledge Distillation, Optical Flow Estimation · [Unverified]
- Enhancing Few-shot Keyword Spotting Performance through Pre-Trained Self-supervised Speech Models · Jun 21, 2025 · Dimensionality Reduction, Keyword Spotting · [Code Available]
- Fine-grained Image Retrieval via Dual-Vision Adaptation · Jun 19, 2025 · Image Retrieval, Knowledge Distillation · [Unverified]
- Knowledge Distillation Framework for Accelerating High-Accuracy Neural Network-Based Molecular Dynamics Simulations · Jun 18, 2025 · Knowledge Distillation · [Unverified]
- Factorized RVQ-GAN For Disentangled Speech Tokenization · Jun 18, 2025 · Disentanglement, Knowledge Distillation · [Unverified]
- KDMOS: Knowledge Distillation for Motion Segmentation · Jun 17, 2025 · Autonomous Driving, Knowledge Distillation · [Unverified]
- Model compression using knowledge distillation with integrated gradients · Jun 17, 2025 · Data Augmentation, Knowledge Distillation · [Code Available]
- AgentDistill: Training-Free Agent Distillation with Generalizable MCP Boxes · Jun 17, 2025 · Knowledge Distillation, Transfer Learning · [Unverified]
- Lightweight Task-Oriented Semantic Communication Empowered by Large-Scale AI Models · Jun 16, 2025 · Knowledge Distillation, Semantic Communication · [Unverified]
- A Technical Study into Small Reasoning Language Models · Jun 16, 2025 · Code Generation, Computational Efficiency · [Unverified]
- HKD4VLM: A Progressive Hybrid Knowledge Distillation Framework for Robust Multimodal Hallucination and Factuality Detection in VLMs · Jun 16, 2025 · Hallucination, Knowledge Distillation · [Unverified]
- Ground Reaction Force Estimation via Time-aware Knowledge Distillation · Jun 12, 2025 · Knowledge Distillation · [Unverified]
- A Novel Lightweight Transformer with Edge-Aware Fusion for Remote Sensing Image Captioning · Jun 11, 2025 · Decoder, Image Captioning · [Unverified]
- Multi-Teacher Language-Aware Knowledge Distillation for Multilingual Speech Emotion Recognition · Jun 10, 2025 · Emotion Recognition, Knowledge Distillation · [Unverified]
- Towards Class-wise Fair Adversarial Training via Anti-Bias Soft Label Distillation · Jun 10, 2025 · Adversarial Robustness, Fairness · [Code Available]
- Being Strong Progressively! Enhancing Knowledge Distillation of Large Language Models through a Curriculum Learning Framework · Jun 6, 2025 · Instruction Following, Knowledge Distillation · [Code Available]
- Label-Context-Dependent Internal Language Model Estimation for CTC · Jun 6, 2025 · Knowledge Distillation, Language Modeling · [Code Available]
- hdl2v: A Code Translation Dataset for Enhanced LLM Verilog Generation · Jun 5, 2025 · Code Generation, Code Translation · [Unverified]
- Static Word Embeddings for Sentence Semantic Representation · Jun 5, 2025 · Contrastive Learning, Knowledge Distillation · [Unverified]
- StatsMerging: Statistics-Guided Model Merging via Task-Specific Teacher Distillation · Jun 5, 2025 · Knowledge Distillation · [Unverified]
- Debate, Reflect, and Distill: Multi-Agent Feedback with Tree-Structured Preference Optimization for Efficient Language Model Enhancement · Jun 4, 2025 · Knowledge Distillation, Language Modeling · [Code Available]
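Most of the papers listed above build on the soft-target objective introduced in "Distilling the Knowledge in a Neural Network" (Mar 9, 2015): the student is trained against the teacher's temperature-softened output distribution in addition to the hard labels. A minimal sketch in plain Python, assuming the standard formulation with temperature T and mixing weight alpha (function names and the example logits are illustrative, not taken from any listed paper):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.5):
    """Soft-target distillation loss: cross-entropy against the teacher's
    temperature-softened outputs, mixed with hard-label cross-entropy.
    The soft term is scaled by T^2 so its gradient magnitude stays
    comparable across temperatures, as in the original formulation."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student_soft = softmax(student_logits, temperature)
    soft_ce = -sum(t * math.log(s) for t, s in zip(p_teacher, p_student_soft))
    p_student = softmax(student_logits)  # hard term uses T = 1
    hard_ce = -math.log(p_student[true_label])
    return alpha * (temperature ** 2) * soft_ce + (1 - alpha) * hard_ce
```

A student whose logits agree with the teacher's incurs only the irreducible entropy of the soft targets, so `distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0], 0)` is smaller than the loss for a student that ranks the classes in the opposite order.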