- Analyzing the Importance of Blank for CTC-Based Knowledge Distillation (Jun 2, 2025). Tags: Automatic Speech Recognition, Knowledge Distillation.
- Feature Fusion and Knowledge-Distilled Multi-Modal Multi-Target Detection (May 31, 2025). Tags: Domain Adaptation, Knowledge Distillation.
- Fine-tune Before Structured Pruning: Towards Compact and Accurate Self-Supervised Models for Speaker Diarization (May 30, 2025). Tags: GPU, Knowledge Distillation.
- Revisiting Cross-Modal Knowledge Distillation: A Disentanglement Approach for RGBD Semantic Segmentation (May 30, 2025). Tags: Autonomous Driving, Contrastive Learning.
- Progressive Class-level Distillation (May 30, 2025). Tags: Benchmarking, Knowledge Distillation. [code available]
- CL-LoRA: Continual Low-Rank Adaptation for Rehearsal-Free Class-Incremental Learning (May 30, 2025). Tags: Class-Incremental Learning.
- Proactive Guidance of Multi-Turn Conversation in Industrial Search (May 30, 2025). Tags: Knowledge Distillation, Reinforcement Learning. [code available]
- CREFT: Sequential Multi-Agent LLM for Character Relation Extraction (May 30, 2025). Tags: Knowledge Distillation, Language Modeling.
- A Simple Linear Patch Revives Layer-Pruned Large Language Models (May 30, 2025). Tags: Knowledge Distillation, Question Answering.
- Sketch Down the FLOPs: Towards Efficient Networks for Human Sketch (May 29, 2025). Tags: Image Retrieval, Knowledge Distillation.
- Knowledge Distillation for Reservoir-based Classifier: Human Activity Recognition (May 29, 2025). Tags: Activity Recognition, Edge Computing.
- CAST: Contrastive Adaptation and Distillation for Semi-Supervised Instance Segmentation (May 28, 2025). Tags: Domain Adaptation, Instance Segmentation.
- Improving Respiratory Sound Classification with Architecture-Agnostic Knowledge Distillation from Ensembles (May 28, 2025). Tags: Knowledge Distillation, Sound Classification.
- Multi-MLLM Knowledge Distillation for Out-of-Context News Detection (May 28, 2025). Tags: Knowledge Distillation, Misinformation. [code available]
- EasyDistill: A Comprehensive Toolkit for Effective Knowledge Distillation of Large Language Models (May 27, 2025). Tags: Knowledge Distillation.
- Light distillation for Incremental Graph Convolution Collaborative Filtering (May 26, 2025). Tags: Collaborative Filtering, Knowledge Distillation.
- Model Stitching by Functional Latent Alignment (May 26, 2025). Tags: Knowledge Distillation, Model.
- Efficient Speech Translation through Model Compression and Knowledge Distillation (May 26, 2025). Tags: Knowledge Distillation, Model Compression.
- From Data to Modeling: Fully Open-vocabulary Scene Graph Generation (May 26, 2025). Tags: Graph Generation, Knowledge Distillation. [code available]
- DOGe: Defensive Output Generation for LLM Protection Against Knowledge Distillation (May 26, 2025). Tags: Knowledge Distillation.
- ESLM: Risk-Averse Selective Language Modeling for Efficient Pretraining (May 26, 2025). Tags: Knowledge Distillation, Language Modeling. [code available]
- Optimizing edge AI models on HPC systems with the edge in the loop (May 26, 2025). Tags: Hardware-Aware Neural Architecture Search, Knowledge Distillation.
- Mosaic: Data-Free Knowledge Distillation via Mixture-of-Experts for Heterogeneous Distributed Environments (May 26, 2025). Tags: Data-Free Knowledge Distillation, Federated Learning. [code available]
- Online Knowledge Distillation with Reward Guidance (May 25, 2025). Tags: Imitation Learning, Knowledge Distillation. [code available]
- Remote Sensing Image Classification with Decoupled Knowledge Distillation (May 25, 2025). Tags: Image Classification.
- Holistic White-light Polyp Classification via Alignment-free Dense Distillation of Auxiliary Optical Chromoendoscopy (May 25, 2025). Tags: Diagnostic, Knowledge Distillation.
- Tokenizing Electron Cloud in Protein-Ligand Interaction Learning (May 25, 2025). Tags: Knowledge Distillation, Prediction. [code available]
- Knowledge Grafting of Large Language Models (May 24, 2025). Tags: Continual Learning, Knowledge Distillation.
- C3R: Channel Conditioned Cell Representations for unified evaluation in microscopy imaging (May 24, 2025). Tags: Knowledge Distillation. [code available]
- Single Snapshot Distillation for Phase Coded Mask Design in Phase Retrieval (May 23, 2025). Tags: Global Optimization, Knowledge Distillation.
- ToDi: Token-wise Distillation via Fine-Grained Divergence Control (May 22, 2025). Tags: Instruction Following, Knowledge Distillation.
- On Multilingual Encoder Language Model Compression for Low-Resource Languages (May 22, 2025). Tags: Knowledge Distillation, Language Modeling.
- SEDD-PCC: A Single Encoder-Dual Decoder Framework For End-To-End Learned Point Cloud Compression (May 22, 2025). Tags: Attribute, Decoder.
- MentalMAC: Enhancing Large Language Models for Detecting Mental Manipulation via Multi-Task Anti-Curriculum Distillation (May 21, 2025). Tags: Knowledge Distillation.
- Deliberation on Priors: Trustworthy Reasoning of Large Language Models on Knowledge Graphs (May 21, 2025). Tags: Knowledge Distillation, Knowledge Graphs.
- On the Generalization vs Fidelity Paradox in Knowledge Distillation (May 21, 2025). Tags: Knowledge Distillation, Transfer Learning. [code available]
- An Efficient Private GPT Never Autoregressively Decodes (May 21, 2025). Tags: Knowledge Distillation. [code available]
- DeepKD: A Deeply Decoupled and Denoised Knowledge Distillation Trainer (May 21, 2025). Tags: Denoising, Knowledge Distillation.
- UWSAM: Segment Anything Model Guided Underwater Instance Segmentation and A Large-scale Benchmark Dataset (May 21, 2025). Tags: Instance Segmentation, Knowledge Distillation. [code available]
- Bridge the Gap between Past and Future: Siamese Model Optimization for Context-Aware Document Ranking (May 20, 2025). Tags: Document Ranking, Information Retrieval. [code available]
- Intra-class Patch Swap for Self-Distillation (May 20, 2025). Tags: Image Classification.
- Interpretable Traces, Unexpected Outcomes: Investigating the Disconnect in Trace-Based Knowledge Distillation (May 20, 2025). Tags: Information Retrieval, Knowledge Distillation. [code available]
- Ground-V: Teaching VLMs to Ground Complex Instructions in Pixels (May 20, 2025). Tags: Instruction Following, Knowledge Distillation.
- Improved Methods for Model Pruning and Knowledge Distillation (May 20, 2025). Tags: Knowledge Distillation.
- Bridging the Modality Gap: Enhancing Channel Prediction with Semantically Aligned LLMs and Knowledge Distillation (May 19, 2025). Tags: Knowledge Distillation, Prediction.
- SMOTExT: SMOTE meets Large Language Models (May 19, 2025). Tags: Cross-Modal Retrieval, Data Augmentation.
- A Token is Worth over 1,000 Tokens: Efficient Knowledge Distillation through Low-Rank Clone (May 19, 2025). Tags: Knowledge Distillation, Transfer Learning. [code available]
- Towards Low-Latency Event Stream-based Visual Object Tracking: A Slow-Fast Approach (May 19, 2025). Tags: Knowledge Distillation, Object Tracking. [code available]
- Robust Multimodal Segmentation with Representation Regularization and Hybrid Prototype Distillation (May 19, 2025). Tags: Knowledge Distillation, Semantic Segmentation. [code available]
- ORQA: A Benchmark and Foundation Model for Holistic Operating Room Modeling (May 19, 2025). Tags: Graph Generation, Knowledge Distillation. [code available]