Visual-Language Model Knowledge Distillation Method for Image Quality Assessment (Jul 21, 2025). Tags: Image Quality Assessment, Knowledge Distillation.
Uncertainty-Aware Cross-Modal Knowledge Distillation with Prototype Learning for Multimodal Brain-Computer Interfaces (Jul 17, 2025). Code unverified. Tags: EEG, Knowledge Distillation.
DVFL-Net: A Lightweight Distilled Video Focal Modulation Network for Spatio-Temporal Action Recognition (Jul 16, 2025). Code unverified. Tags: Benchmarking, Knowledge Distillation.
HanjaBridge: Resolving Semantic Ambiguity in Korean LLMs via Hanja-Augmented Pre-Training (Jul 15, 2025). Code available. Tags: Cross-Lingual Transfer, Knowledge Distillation.
Feature Distillation is the Better Choice for Model-Heterogeneous Federated Learning (Jul 14, 2025). Code unverified. Tags: Federated Learning, Knowledge Distillation.
SFedKD: Sequential Federated Learning with Discrepancy-Aware Multi-Teacher Knowledge Distillation (Jul 11, 2025). Code unverified. Tags: Federated Learning, Knowledge Distillation.
Towards Collaborative Fairness in Federated Learning Under Imbalanced Covariate Shift (Jul 11, 2025). Code unverified. Tags: Collaborative Fairness, Fairness.
KAT-V1: Kwai-AutoThink Technical Report (Jul 11, 2025). Code unverified. Tags: Knowledge Distillation, Large Language Model.
The Trilemma of Truth in Large Language Models (Jun 30, 2025). Code unverified. Tags: Attribute, Conformal Prediction.
Layer Importance for Mathematical Reasoning is Forged in Pre-Training and Invariant after Post-Training (Jun 27, 2025). Code available. Tags: Knowledge Distillation, Mathematical Reasoning.
Distilling Normalizing Flows (Jun 26, 2025). Code unverified. Tags: Density Estimation, Knowledge Distillation.
G^2D: Boosting Multimodal Learning with Gradient-Guided Distillation (Jun 26, 2025). Code unverified. Tags: Knowledge Distillation, Model Optimization.
Continual Self-Supervised Learning with Masked Autoencoders in Remote Sensing (Jun 26, 2025). Code available. Tags: Continual Learning, Continual Self-Supervised Learning.
Building Lightweight Semantic Segmentation Models for Aerial Images Using Dual Relation Distillation (Jun 25, 2025). Code unverified. Tags: Knowledge Distillation, Relation.
Towards Scalable and Generalizable Earth Observation Data Mining via Foundation Model Composition (Jun 25, 2025). Code unverified. Tags: Earth Observation, Knowledge Distillation.
FedBKD: Distilled Federated Learning to Embrace Gerneralization and Personalization on Non-IID Data (Jun 25, 2025). Code unverified. Tags: Federated Learning, Knowledge Distillation.
Tackling Data Heterogeneity in Federated Learning through Knowledge Distillation with Inequitable Aggregation (Jun 25, 2025). Code available. Tags: Federated Learning, Knowledge Distillation.
Client Clustering Meets Knowledge Sharing: Enhancing Privacy and Robustness in Personalized Peer-to-Peer Learning (Jun 25, 2025). Code available. Tags: Knowledge Distillation, Transfer Learning.
Recalling The Forgotten Class Memberships: Unlearned Models Can Be Noisy Labelers to Leak Privacy (Jun 24, 2025). Code unverified. Tags: Knowledge Distillation, Learning with Noisy Labels.
Distillation-Enabled Knowledge Alignment for Generative Semantic Communications in AIGC Provisioning Tasks (Jun 24, 2025). Code unverified. Tags: Knowledge Distillation, Semantic Communication.
GNN's Uncertainty Quantification using Self-Distillation (Jun 24, 2025). Code unverified. Tags: Knowledge Distillation, Uncertainty Quantification.
PicoSAM2: Low-Latency Segmentation In-Sensor for Edge Vision Applications (Jun 23, 2025). Code available. Tags: Knowledge Distillation, Privacy Preserving.
Efficient and Generalizable Speaker Diarization via Structured Pruning of Self-Supervised Models (Jun 23, 2025). Code unverified. Tags: Domain Adaptation, GPU.
Multimodal Fusion SLAM with Fourier Attention (Jun 22, 2025). Code available. Tags: Knowledge Distillation, Optical Flow Estimation.
Enhancing Few-shot Keyword Spotting Performance through Pre-Trained Self-supervised Speech Models (Jun 21, 2025). Code available. Tags: Dimensionality Reduction, Keyword Spotting.
Fine-grained Image Retrieval via Dual-Vision Adaptation (Jun 19, 2025). Code unverified. Tags: Image Retrieval, Knowledge Distillation.
Knowledge Distillation Framework for Accelerating High-Accuracy Neural Network-Based Molecular Dynamics Simulations (Jun 18, 2025). Code unverified. Tags: Knowledge Distillation.
Factorized RVQ-GAN For Disentangled Speech Tokenization (Jun 18, 2025). Code unverified. Tags: Disentanglement, Knowledge Distillation.
AgentDistill: Training-Free Agent Distillation with Generalizable MCP Boxes (Jun 17, 2025). Code unverified. Tags: Knowledge Distillation, Transfer Learning.
Model compression using knowledge distillation with integrated gradients (Jun 17, 2025). Code unverified. Tags: Data Augmentation, Knowledge Distillation.
KDMOS: Knowledge Distillation for Motion Segmentation (Jun 17, 2025). Code unverified. Tags: Autonomous Driving, Knowledge Distillation.
Lightweight Task-Oriented Semantic Communication Empowered by Large-Scale AI Models (Jun 16, 2025). Code available. Tags: Knowledge Distillation, Semantic Communication.
SeqPE: Transformer with Sequential Position Encoding (Jun 16, 2025). Code unverified. Tags: Image Classification.
HKD4VLM: A Progressive Hybrid Knowledge Distillation Framework for Robust Multimodal Hallucination and Factuality Detection in VLMs (Jun 16, 2025). Code available. Tags: Hallucination, Knowledge Distillation.
A Technical Study into Small Reasoning Language Models (Jun 16, 2025). Code unverified. Tags: Code Generation, Computational Efficiency.
Ground Reaction Force Estimation via Time-aware Knowledge Distillation (Jun 12, 2025). Code unverified. Tags: Knowledge Distillation.
A Novel Lightweight Transformer with Edge-Aware Fusion for Remote Sensing Image Captioning (Jun 11, 2025). Code unverified. Tags: Decoder, Image Captioning.
Multi-Teacher Language-Aware Knowledge Distillation for Multilingual Speech Emotion Recognition (Jun 10, 2025). Code unverified. Tags: Emotion Recognition, Knowledge Distillation.
SwS: Self-aware Weakness-driven Problem Synthesis in Reinforcement Learning for LLM Reasoning (Jun 10, 2025). Code available. Tags: Knowledge Distillation, Math.
Towards Class-wise Fair Adversarial Training via Anti-Bias Soft Label Distillation (Jun 10, 2025). Code available. Tags: Adversarial Robustness, Fairness.
Label-Context-Dependent Internal Language Model Estimation for CTC (Jun 6, 2025). Code available. Tags: Knowledge Distillation, Language Modeling.
Being Strong Progressively! Enhancing Knowledge Distillation of Large Language Models through a Curriculum Learning Framework (Jun 6, 2025). Code unverified. Tags: Instruction Following, Knowledge Distillation.
StatsMerging: Statistics-Guided Model Merging via Task-Specific Teacher Distillation (Jun 5, 2025). Code available. Tags: Knowledge Distillation.
Static Word Embeddings for Sentence Semantic Representation (Jun 5, 2025). Code available. Tags: Contrastive Learning, Knowledge Distillation.
hdl2v: A Code Translation Dataset for Enhanced LLM Verilog Generation (Jun 5, 2025). Code unverified. Tags: Code Generation, Code Translation.
Debate, Reflect, and Distill: Multi-Agent Feedback with Tree-Structured Preference Optimization for Efficient Language Model Enhancement (Jun 4, 2025). Code unverified. Tags: Knowledge Distillation, Language Modeling.
QA-HFL: Quality-Aware Hierarchical Federated Learning for Resource-Constrained Mobile Devices with Heterogeneous Image Quality (Jun 4, 2025). Code unverified. Tags: Federated Learning, Knowledge Distillation.
Building a Few-Shot Cross-Domain Multilingual NLU Model for Customer Care (Jun 4, 2025). Code unverified. Tags: Intent Detection, Knowledge Distillation.
TalkingMachines: Real-Time Audio-Driven FaceTime-Style Video via Autoregressive Diffusion Models (Jun 3, 2025). Code unverified. Tags: Decoder, Knowledge Distillation.
KDRL: Post-Training Reasoning LLMs via Unified Knowledge Distillation and Reinforcement Learning (Jun 2, 2025). Code unverified. Tags: Knowledge Distillation, Large Language Model.