Why Knowledge Distillation Works in Generative Models: A Minimal Working Explanation (May 19, 2025). Tags: Knowledge Distillation, Language Modeling.
SSR: Enhancing Depth Perception in Vision-Language Models via Rationale-Guided Spatial Reasoning (May 18, 2025). Tags: Knowledge Distillation, Spatial Reasoning.
LAMeTA: Intent-Aware Agentic Network Optimization via a Large AI Model-Empowered Two-Stage Approach (May 18, 2025). Tags: Deep Reinforcement Learning, Knowledge Distillation.
Denoising Mutual Knowledge Distillation in Bi-Directional Multiple Instance Learning (May 17, 2025). Tags: Denoising, Image Classification.
On Membership Inference Attacks in Knowledge Distillation (May 17, 2025). Tags: Knowledge Distillation, Privacy Preserving.
FiGKD: Fine-Grained Knowledge Distillation via High-Frequency Detail Transfer (May 17, 2025). Tags: Fine-Grained Visual Recognition, Knowledge Distillation. Code available.
Bidirectional Distillation: A Mixed-Play Framework for Multi-Agent Generalizable Behaviors (May 16, 2025). Tags: Knowledge Distillation, Multi-Agent Reinforcement Learning.
Distilled Circuits: A Mechanistic Study of Internal Restructuring in Knowledge Distillation (May 16, 2025). Tags: Knowledge Distillation.
Semantically-Aware Game Image Quality Assessment (May 16, 2025). Tags: Feature Importance, Image Quality Assessment. Code available.
Advancing Multiple Instance Learning with Continual Learning for Whole Slide Imaging (May 15, 2025). Tags: Continual Learning, Diagnostic.
DCSNet: A Lightweight Knowledge Distillation-Based Model with Explainable AI for Lung Cancer Diagnosis from Histopathological Images (May 14, 2025). Tags: Diagnostic, Knowledge Distillation.
Low-Complexity Inference in Continual Learning via Compressed Knowledge Transfer (May 13, 2025). Tags: Class-Incremental Learning.
MoKD: Multi-Task Optimization for Knowledge Distillation (May 13, 2025). Tags: Image Classification.
Fusing Bidirectional Chains of Thought and Reward Mechanisms: A Method for Enhancing Question-Answering Capabilities of Large Language Models for Chinese Intangible Cultural Heritage (May 13, 2025). Tags: Knowledge Distillation, Large Language Model.
Simple Semi-supervised Knowledge Distillation from Vision-Language Models via Dual-Head Optimization (May 12, 2025). Tags: Few-Shot Image Classification, Knowledge Distillation.
Ranking-aware Continual Learning for LiDAR Place Recognition (May 12, 2025). Tags: Autonomous Driving, Continual Learning. Code available.
Channel Fingerprint Construction for Massive MIMO: A Deep Conditional Generative Approach (May 12, 2025). Tags: Denoising, Knowledge Distillation.
Topology-Guided Knowledge Distillation for Efficient Point Cloud Processing (May 12, 2025). Tags: 3D Object Recognition, Autonomous Driving.
KDH-MLTC: Knowledge Distillation for Healthcare Multi-Label Text Classification (May 12, 2025). Tags: Classification, Hyperparameter Optimization. Code available.
An Extra RMSNorm is All You Need for Fine Tuning to 1.58 Bits (May 12, 2025). Tags: Knowledge Distillation.
Knowledge Distillation for Enhancing Walmart E-commerce Search Relevance Using Large Language Models (May 11, 2025). Tags: Knowledge Distillation.
Human in the Latent Loop (HILL): Interactively Guiding Model Training Through Human Intuition (May 9, 2025). Tags: Knowledge Distillation.
Robust & Precise Knowledge Distillation-based Novel Context-Aware Predictor for Disease Detection in Brain and Gastrointestinal (May 9, 2025). Tags: Disease Prediction, Knowledge Distillation.
Federated Deconfounding and Debiasing Learning for Out-of-Distribution Generalization (May 8, 2025). Tags: Attribute, Benchmarking.
Biomed-DPT: Dual Modality Prompt Tuning for Biomedical Vision-Language Models (May 8, 2025). Tags: Clinical Knowledge, Diagnostic.
Theoretical Guarantees for LT-TTD: A Unified Transformer-based Architecture for Two-Level Ranking Systems (May 7, 2025). Tags: Computational Efficiency, Knowledge Distillation. Code available.
Image Recognition with Online Lightweight Vision Transformer: A Survey (May 6, 2025). Tags: Knowledge Distillation, Survey.
Knowledge Distillation for Speech Denoising by Latent Representation Alignment with Cosine Distance (May 6, 2025). Tags: Denoising, Knowledge Distillation. Code available.
SepALM: Audio Language Models Are Error Correctors for Robust Speech Separation (May 6, 2025). Tags: Automatic Speech Recognition (ASR).
Action Spotting and Precise Event Detection in Sports: Datasets, Methods, and Challenges (May 6, 2025). Tags: Action Localization, Action Spotting.
Artificial Behavior Intelligence: Technology, Challenges, and Future Directions (May 6, 2025). Tags: Autonomous Driving, Emotion Recognition.
End-to-end fully-binarized network design: from Generic Learned Thermometer to Block Pruning (May 5, 2025). Tags: Knowledge Distillation, Quantization.
AKD: Adversarial Knowledge Distillation For Large Language Models Alignment on Coding tasks (May 5, 2025). Tags: Code Completion, Code Generation.
FedSDAF: Leveraging Source Domain Awareness for Enhanced Federated Domain Generalization (May 5, 2025). Tags: Domain Generalization, Knowledge Distillation.
Optimizing LLMs for Resource-Constrained Environments: A Survey of Model Compression Techniques (May 5, 2025). Tags: Knowledge Distillation, Mixture-of-Experts. Code available.
Segment Any RGB-Thermal Model with Language-aided Distillation (May 4, 2025). Tags: Instance Segmentation, Knowledge Distillation.
High-Fidelity Pseudo-label Generation by Large Language Models for Training Robust Radiology Report Classifiers (May 3, 2025). Tags: Diagnostic, Knowledge Distillation.
Toward Data-centric Directed Graph Learning: An Entropy-driven Approach (May 2, 2025). Tags: Graph Learning, Knowledge Distillation.
Llama-Nemotron: Efficient Reasoning Models (May 2, 2025). Tags: Knowledge Distillation, Neural Architecture Search.
Uncertainty-Aware Multi-Expert Knowledge Distillation for Imbalanced Disease Grading (May 1, 2025). Tags: Knowledge Distillation, Transfer Learning.
Enhancing New-item Fairness in Dynamic Recommender Systems (Apr 30, 2025). Tags: Fairness, Knowledge Distillation.
CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation (Apr 30, 2025). Tags: Data-Free Knowledge Distillation, Knowledge Distillation. Code available.
How to Backdoor the Knowledge Distillation (Apr 30, 2025). Tags: Knowledge Distillation.
The Estimation of Continual Causal Effect for Dataset Shifting Streams (Apr 29, 2025). Tags: Counterfactual, Knowledge Distillation.
Head-Tail-Aware KL Divergence in Knowledge Distillation for Spiking Neural Networks (Apr 29, 2025). Tags: Knowledge Distillation, Transfer Learning.
DS_FusionNet: Dynamic Dual-Stream Fusion with Bidirectional Knowledge Distillation for Plant Disease Recognition (Apr 29, 2025). Tags: Fine-Grained Image Classification, Image Classification.
Trace-of-Thought Prompting: Investigating Prompt-Based Knowledge Distillation Through Question Decomposition (Apr 29, 2025). Tags: GSM8K, Knowledge Distillation. Code available.
SAM-Guided Robust Representation Learning for One-Shot 3D Medical Image Segmentation (Apr 29, 2025). Tags: General Knowledge, Image Segmentation.
Federated One-Shot Learning with Data Privacy and Objective-Hiding (Apr 29, 2025). Tags: Federated Learning, Information Retrieval.
Knowledge Distillation of Domain-adapted LLMs for Question-Answering in Telecom (Apr 28, 2025). Tags: Domain Adaptation, Knowledge Distillation.