Dynamic Activation with Knowledge Distillation for Energy-Efficient Spiking NN Ensembles (Feb 19, 2025). Tags: Disentanglement, Ensemble Learning.
Capturing Rich Behavior Representations: A Dynamic Action Semantic-Aware Graph Transformer for Video Captioning (Feb 19, 2025). Tags: Knowledge Distillation, Object.
MambaLiteSR: Image Super-Resolution with Low-Rank Mamba using Knowledge Distillation (Feb 19, 2025). Tags: Image Super-Resolution, Knowledge Distillation.
JL1-CD: A New Benchmark for Remote Sensing Change Detection and a Robust Multi-Teacher Knowledge Distillation Framework (Feb 19, 2025). Tags: Change Detection, Earth Observation.
Enhancing Semi-supervised Learning with Zero-shot Pseudolabels (Feb 18, 2025). Tags: Knowledge Distillation. Code available.
Integrating Arithmetic Learning Improves Mathematical Reasoning in Smaller Models (Feb 18, 2025). Tags: Data Augmentation, GSM8K.
NaturalReasoning: Reasoning in the Wild with 2.8M Challenging Questions (Feb 18, 2025). Tags: Knowledge Distillation, Math.
Every Expert Matters: Towards Effective Knowledge Distillation for Mixture-of-Experts Language Models (Feb 18, 2025). Tags: Knowledge Distillation, Mixture-of-Experts.
Does Training with Synthetic Data Truly Protect Privacy? (Feb 18, 2025). Tags: Data-free Knowledge Distillation, Dataset Distillation.
Warmup-Distill: Bridge the Distribution Mismatch between Teacher and Student before Knowledge Distillation (Feb 17, 2025). Tags: Knowledge Distillation, Math. Code available.
Can LLM Watermarks Robustly Prevent Unauthorized Knowledge Distillation? (Feb 17, 2025). Tags: Knowledge Distillation, Language Modeling. Code available.
Leave No One Behind: Enhancing Diversity While Maintaining Accuracy in Social Recommendation (Feb 17, 2025). Tags: Diversity, Knowledge Distillation. Code available.
Enhancing Cross-Tokenizer Knowledge Distillation with Contextual Dynamical Mapping (Feb 16, 2025). Tags: Code Generation, Instruction Following. Code available.
DA-Mamba: Domain Adaptive Hybrid Mamba-Transformer Based One-Stage Object Detection (Feb 16, 2025). Tags: Domain Adaptation, Knowledge Distillation. Code available.
Leveraging Conditional Mutual Information to Improve Large Language Model Fine-Tuning For Classification (Feb 16, 2025). Tags: Classification, Image Classification. Code available.
Smoothing Out Hallucinations: Mitigating LLM Hallucination with Smoothed Knowledge Distillation (Feb 16, 2025). Tags: Hallucination, Knowledge Distillation.
CLoCKDistill: Consistent Location-and-Context-aware Knowledge Distillation for DETRs (Feb 15, 2025). Tags: Denoising, Knowledge Distillation.
LLM-driven Knowledge Distillation for Dynamic Text-Attributed Graphs (Feb 15, 2025). Tags: Edge Classification, Knowledge Distillation.
AIDE: Agentically Improve Visual Language Model with Domain Experts (Feb 13, 2025). Tags: Knowledge Distillation, Language Modeling.
LLM Pretraining with Continuous Concepts (Feb 12, 2025). Tags: Knowledge Distillation, Language Modeling.
Vision-Language Models for Edge Networks: A Comprehensive Survey (Feb 11, 2025). Tags: Autonomous Vehicles, Image Captioning.
Optimizing Knowledge Distillation in Transformers: Enabling Multi-Head Attention without Alignment Barriers (Feb 11, 2025). Tags: Image Classification.
Life-Code: Central Dogma Modeling with Multi-Omics Sequence Unification (Feb 11, 2025). Tags: Knowledge Distillation.
OpenGrok: Enhancing SNS Data Processing with Distilled Knowledge and Mask-like Mechanisms (Feb 11, 2025). Tags: Knowledge Distillation, MMLU.
Right Time to Learn: Promoting Generalization via Bio-inspired Spacing Effect in Knowledge Distillation (Feb 10, 2025). Tags: Knowledge Distillation. Code available.
Progressive Collaborative and Semantic Knowledge Fusion for Generative Recommendation (Feb 10, 2025). Tags: Knowledge Distillation. Code available.
DROP: Poison Dilution via Knowledge Distillation for Federated Learning (Feb 10, 2025). Tags: Data Poisoning, Federated Learning.
Rationalization Models for Text-to-SQL (Feb 10, 2025). Tags: Knowledge Distillation, Language Modeling. Code available.
Contrastive Representation Distillation via Multi-Scale Feature Decoupling (Feb 9, 2025). Tags: Knowledge Distillation, Transfer Learning.
Audio-Visual Representation Learning via Knowledge Distillation from Speech Foundation Models (Feb 9, 2025). Tags: Audio-Visual Speech Recognition, Automatic Speech Recognition.
Synergistic Effects of Knowledge Distillation and Structured Pruning for Self-Supervised Speech Models (Feb 9, 2025). Tags: Knowledge Distillation, Model Compression. Code available.
ATLAS: Autoformalizing Theorems through Lifting, Augmentation, and Synthesis of Data (Feb 8, 2025). Tags: Knowledge Distillation.
Demystifying Catastrophic Forgetting in Two-Stage Incremental Object Detector (Feb 8, 2025). Tags: Incremental Learning, Knowledge Distillation.
Event Stream-based Visual Object Tracking: HDETrack V2 and A High-Definition Benchmark (Feb 8, 2025). Tags: Knowledge Distillation, Object Tracking.
Multilingual Non-Autoregressive Machine Translation without Knowledge Distillation (Feb 6, 2025). Tags: Knowledge Distillation, Machine Translation. Code available.
BOLT: Bootstrap Long Chain-of-Thought in Language Models without Distillation (Feb 6, 2025). Tags: In-Context Learning, Knowledge Distillation. Code available.
Revisiting Intermediate-Layer Matching in Knowledge Distillation: Layer-Selection Strategy Doesn't Matter (Much) (Feb 6, 2025). Tags: Knowledge Distillation.
Towards Unified Music Emotion Recognition across Dimensional and Categorical Models (Feb 6, 2025). Tags: Emotion Recognition, Knowledge Distillation.
A Unified Knowledge-Distillation and Semi-Supervised Learning Framework to Improve Industrial Ads Delivery Systems (Feb 5, 2025). Tags: Knowledge Distillation. Code available.
Training an LLM-as-a-Judge Model: Pipeline, Insights, and Practical Lessons (Feb 5, 2025). Tags: Instruction Following, Knowledge Distillation.
MIND: Modality-Informed Knowledge Distillation Framework for Multimodal Clinical Prediction Tasks (Feb 3, 2025). Tags: Imputation, Knowledge Distillation.
A Framework for Double-Blind Federated Adaptation of Foundation Models (Feb 3, 2025). Tags: Federated Learning, Image Classification.
VLM-Assisted Continual Learning for Visual Question Answering in Self-Driving (Feb 2, 2025). Tags: Autonomous Driving, Continual Learning.
A Method for Estimating Forest Carbon Storage Distribution Density via an Artificial Intelligence Generated Content Model (Feb 2, 2025). Tags: Knowledge Distillation.
FedHPD: Heterogeneous Federated Reinforcement Learning via Policy Distillation (Feb 2, 2025). Tags: Knowledge Distillation, Reinforcement Learning.
Role of Mixup in Topological Persistence Based Knowledge Distillation for Wearable Sensor Data (Feb 2, 2025). Tags: Data Augmentation, Knowledge Distillation. Code available.
Robust Knowledge Distillation in Federated Learning: Counteracting Backdoor Attacks (Feb 1, 2025). Tags: Federated Learning, Knowledge Distillation.
Rethinking the Upsampling Layer in Hyperspectral Image Super Resolution (Jan 30, 2025). Tags: Hyperspectral Image Super-Resolution, Image Super-Resolution. Code available.
Mini-ResEmoteNet: Leveraging Knowledge Distillation for Human-Centered Design (Jan 30, 2025). Tags: Emotion Recognition, Facial Emotion Recognition.
RL-based Query Rewriting with Distilled LLM for Online E-Commerce Systems (Jan 29, 2025). Tags: Knowledge Distillation, Natural Language Understanding.