- Intermediate Distillation: Data-Efficient Distillation from Black-Box LLMs for Information Retrieval (Jun 18, 2024). Tags: Information Retrieval, Knowledge Distillation.
- STEVE Series: Step-by-Step Construction of Agent Systems in Minecraft (Jun 17, 2024). Tags: Knowledge Distillation, Language Modeling.
- Mutual Learning for Finetuning Click-Through Rate Prediction Models (Jun 17, 2024). Tags: Click-Through Rate Prediction, Knowledge Distillation.
- Graph Knowledge Distillation to Mixture of Experts (Jun 17, 2024). Tags: Knowledge Distillation, Mixture-of-Experts.
- NLDF: Neural Light Dynamic Fields for Efficient 3D Talking Head Generation (Jun 17, 2024). Tags: Knowledge Distillation, NeRF. Code available.
- Knowledge Distillation in Federated Learning: a Survey on Long Lasting Challenges and New Solutions (Jun 16, 2024). Tags: Federated Learning, Knowledge Distillation.
- Self-Knowledge Distillation for Learning Ambiguity (Jun 14, 2024). Tags: Knowledge Distillation, Natural Language Understanding.
- PC-LoRA: Low-Rank Adaptation for Progressive Model Compression with Knowledge Distillation (Jun 13, 2024). Tags: Knowledge Distillation, Model Compression.
- Contextual Distillation Model for Diversified Recommendation (Jun 13, 2024). Tags: Diversity, Knowledge Distillation.
- DistilDoc: Knowledge Distillation for Visually-Rich Document Applications (Jun 12, 2024). Tags: Document Image Classification.
- Adaptive Teaching with Shared Classifier for Knowledge Distillation (Jun 12, 2024). Tags: Knowledge Distillation.
- Guiding Frame-Level CTC Alignments Using Self-knowledge Distillation (Jun 12, 2024). Tags: Automatic Speech Recognition (ASR). Code available.
- GenDistiller: Distilling Pre-trained Language Models based on an Autoregressive Generative Model (Jun 12, 2024). Tags: Knowledge Distillation, Self-Supervised Learning. Code available.
- Unveiling Incomplete Modality Brain Tumor Segmentation: Leveraging Masked Predicted Auto-Encoder and Divergence Learning (Jun 12, 2024). Tags: Brain Tumor Segmentation, Knowledge Distillation.
- Low-Complexity Acoustic Scene Classification Using Parallel Attention-Convolution Network (Jun 12, 2024). Tags: Acoustic Scene Classification, Data Augmentation.
- Self-Distillation Learning Based on Temporal-Spatial Consistency for Spiking Neural Networks (Jun 12, 2024). Tags: Knowledge Distillation. Code available.
- FastAST: Accelerating Audio Spectrogram Transformer via Token Merging and Cross-Model Knowledge Distillation (Jun 11, 2024). Tags: Audio Classification, Knowledge Distillation.
- TernaryLLM: Ternarized Large Language Model (Jun 11, 2024). Tags: Knowledge Distillation, Language Modeling. Code available.
- Teaching with Uncertainty: Unleashing the Potential of Knowledge Distillation in Object Detection (Jun 11, 2024). Tags: Knowledge Distillation, Object Detection.
- Weighted KL-Divergence for Document Ranking Model Refinement (Jun 10, 2024). Tags: Contrastive Learning, Document Ranking.
- BS-PLCNet 2: Two-stage Band-split Packet Loss Concealment Network with Intra-model Knowledge Distillation (Jun 10, 2024). Tags: Knowledge Distillation, Packet Loss Concealment.
- Online Policy Distillation with Decision-Attention (Jun 8, 2024). Tags: Deep Reinforcement Learning, Knowledge Distillation.
- Teaching-Assistant-in-the-Loop: Improving Knowledge Distillation from Imperfect Teacher Models in Low-Budget Scenarios (Jun 8, 2024). Tags: Knowledge Distillation.
- Data-Free Generative Replay for Class-Incremental Learning on Imbalanced Data (Jun 7, 2024). Tags: Class-Incremental Learning.
- IOR: Inversed Objects Replay for Incremental Object Detection (Jun 7, 2024). Tags: Knowledge Distillation, Object. Code available.
- To Distill or Not to Distill? On the Robustness of Robust Knowledge Distillation (Jun 6, 2024). Tags: Automatic Speech Recognition (ASR).
- Step Out and Seek Around: On Warm-Start Training with Incremental Data (Jun 6, 2024). Tags: Autonomous Driving, Knowledge Distillation. Code available.
- Mutual Information Guided Backdoor Mitigation for Pre-trained Encoders (Jun 5, 2024). Tags: Knowledge Distillation, Self-Supervised Learning.
- Decision Boundary-aware Knowledge Consolidation Generates Better Instance-Incremental Learner (Jun 5, 2024). Tags: Class-Incremental Learning.
- Tiny models from tiny data: Textual and null-text inversion for few-shot distillation (Jun 5, 2024). Tags: Few-Shot Image Classification, Image Classification.
- Adversarial Moment-Matching Distillation of Large Language Models (Jun 5, 2024). Tags: Imitation Learning, Instruction Following. Code available.
- PLaD: Preference-based Large Language Model Distillation with Pseudo-Preference Pairs (Jun 5, 2024). Tags: Knowledge Distillation, Language Modeling. Code available.
- RKLD: Reverse KL-Divergence-based Knowledge Distillation for Unlearning Personal Information in Large Language Models (Jun 4, 2024). Tags: Knowledge Distillation, Language Modeling.
- Optimal Transport Guided Correlation Assignment for Multimodal Entity Linking (Jun 4, 2024). Tags: Entity Linking, Knowledge Distillation.
- DL-KDD: Dual-Light Knowledge Distillation for Action Recognition in the Dark (Jun 4, 2024). Tags: Action Recognition, Knowledge Distillation. Code available.
- Decoupled Alignment for Robust Plug-and-Play Adaptation (Jun 3, 2024). Tags: Knowledge Distillation.
- Toward Efficient Deep Spiking Neuron Networks: A Survey on Compression (Jun 3, 2024). Tags: Knowledge Distillation, Quantization.
- Learning Background Prompts to Discover Implicit Knowledge for Open Vocabulary Object Detection (Jun 1, 2024). Tags: Knowledge Distillation, Object.
- Robust Knowledge Distillation Based on Feature Variance Against Backdoored Teacher Model (Jun 1, 2024). Tags: Knowledge Distillation, Model Compression.
- Vision-Language Meets the Skeleton: Progressively Distillation with Cross-Modal Knowledge for 3D Action Representation Learning (May 31, 2024). Tags: Action Recognition, Contrastive Learning. Code available.
- Adv-KD: Adversarial Knowledge Distillation for Faster Diffusion Sampling (May 31, 2024). Tags: Denoising, Image Generation. Code available.
- Multi-label Class Incremental Emotion Decoding with Augmented Emotional Semantics Learning (May 31, 2024). Tags: Class-Incremental Learning. Code available.
- WebUOT-1M: Advancing Deep Underwater Object Tracking with A Million-Scale Benchmark (May 30, 2024). Tags: Knowledge Distillation, Object Tracking.
- GKT: A Novel Guidance-Based Knowledge Transfer Framework For Efficient Cloud-edge Collaboration LLM Deployment (May 30, 2024). Tags: GSM8K, Knowledge Distillation.
- Estimating Human Poses Across Datasets: A Unified Skeleton and Multi-Teacher Distillation Approach (May 30, 2024). Tags: Activity Recognition, Knowledge Distillation. Code available.
- Relation Modeling and Distillation for Learning with Noisy Labels (May 30, 2024). Tags: Contrastive Learning, Knowledge Distillation.
- Distribution Aligned Semantics Adaption for Lifelong Person Re-Identification (May 30, 2024). Tags: Knowledge Distillation, Person Re-Identification.
- Scalable Detection of Salient Entities in News Articles (May 30, 2024). Tags: Articles, Knowledge Distillation. Code available.
- BLSP-KD: Bootstrapping Language-Speech Pre-training via Knowledge Distillation (May 29, 2024). Tags: Instruction Following, Knowledge Distillation.
- Forward-Backward Knowledge Distillation for Continual Clustering (May 29, 2024). Tags: Clustering, Continual Learning.