Distillation Learning Guided by Image Reconstruction for One-Shot Medical Image Segmentation (Aug 7, 2024). Tags: Data Augmentation, Image Reconstruction. [Code available]
Curriculum-scheduled Knowledge Distillation from Multiple Pre-trained Teachers for Multi-domain Sequential Recommendation (Jan 1, 2024). Tags: Knowledge Distillation, Recommendation Systems. [Code available]
Distillation Improves Visual Place Recognition for Low Quality Images (Oct 10, 2023). Tags: Knowledge Distillation, Quantization. [Code available]
GNN's Uncertainty Quantification using Self-Distillation (Jun 24, 2025). Tags: Knowledge Distillation, Uncertainty Quantification. [Code available]
DASK: Distribution Rehearsing via Adaptive Style Kernel Learning for Exemplar-Free Lifelong Person Re-Identification (Dec 12, 2024). Tags: Exemplar-Free, Knowledge Distillation. [Code available]
Teacher Agent: A Knowledge Distillation-Free Framework for Rehearsal-based Video Incremental Learning (Jun 1, 2023). Tags: Incremental Learning, Knowledge Distillation. [Code available]
Improved Knowledge Distillation for Crowd Counting on IoT Device (Aug 2, 2023). Tags: Crowd Counting, Knowledge Distillation. [Code available]
m2mKD: Module-to-Module Knowledge Distillation for Modular Transformers (Feb 26, 2024). Tags: Knowledge Distillation, Mixture-of-Experts. [Code available]
StableMamba: Distillation-free Scaling of Large SSMs for Images and Videos (Sep 18, 2024). Tags: Action Recognition, Image Classification. [Unverified]
Brittle Features May Help Anomaly Detection (Apr 21, 2021). Tags: Anomaly Detection, Knowledge Distillation. [Unverified]
Distillation-Enhanced Physical Adversarial Attacks (Jan 4, 2025). Tags: Adversarial Attack, Knowledge Distillation. [Unverified]
Bring the Power of Diffusion Model to Defect Detection (Aug 25, 2024). Tags: Defect Detection, Denoising. [Unverified]
An Enhanced Low-Resolution Image Recognition Method for Traffic Environments (Sep 28, 2023). Tags: Computational Efficiency, Knowledge Distillation. [Unverified]
Distillation-Enabled Knowledge Alignment for Generative Semantic Communications in AIGC Provisioning Tasks (Jun 24, 2025). Tags: Knowledge Distillation, Semantic Communication. [Unverified]
Knowledge Distillation Decision Tree for Unravelling Black-box Machine Learning Models (Jun 9, 2022). Tags: Knowledge Distillation. [Unverified]
Bridging the Modality Gap: Enhancing Channel Prediction with Semantically Aligned LLMs and Knowledge Distillation (May 19, 2025). Tags: Knowledge Distillation, Prediction. [Unverified]
Bridging the Gap: Unpacking the Hidden Challenges in Knowledge Distillation for Online Ranking Systems (Aug 26, 2024). Tags: Knowledge Distillation, Recommendation Systems. [Unverified]
An Empirical Study of Uniform-Architecture Knowledge Distillation in Document Ranking (Feb 8, 2023). Tags: Document Ranking, Knowledge Distillation. [Unverified]
Bridging the Gap between Prior and Posterior Knowledge Selection for Knowledge-Grounded Dialogue Generation (Nov 1, 2020). Tags: Decoder, Dialogue Generation. [Unverified]
Distill and De-bias: Mitigating Bias in Face Verification using Knowledge Distillation (Dec 17, 2021). Tags: Attribute, Face Recognition. [Unverified]
Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation (Mar 5, 2020). Tags: Domain Adaptation, Knowledge Distillation. [Unverified]
Bridging the Gap Between Patient-specific and Patient-independent Seizure Prediction via Knowledge Distillation (Feb 25, 2022). Tags: Knowledge Distillation, Prediction. [Unverified]
DualDE: Dually Distilling Knowledge Graph Embedding for Faster and Cheaper Reasoning (Sep 13, 2020). Tags: Graph Embedding, Knowledge Distillation. [Unverified]
Bridging the gap between Human Action Recognition and Online Action Detection (Jan 21, 2021). Tags: Action Detection, Action Recognition. [Unverified]
DistilDoc: Knowledge Distillation for Visually-Rich Document Applications (Jun 12, 2024). Tags: Document Image Classification. [Unverified]
An Empirical Study of Leveraging Knowledge Distillation for Compressing Multilingual Neural Machine Translation Models (Apr 19, 2023). Tags: Knowledge Distillation, Machine Translation. [Unverified]
A Deep Hierarchical Feature Sparse Framework for Occluded Person Re-Identification (Jan 15, 2024). Tags: Data Augmentation, Knowledge Distillation. [Unverified]
Supervised domain adaptation for building extraction from off-nadir aerial images (Nov 7, 2023). Tags: Domain Adaptation, Earth Observation. [Unverified]
Disentanglement, Visualization and Analysis of Complex Features in DNNs (Jan 1, 2021). Tags: Disentanglement, Knowledge Distillation. [Unverified]
An Empirical Study of Efficient ASR Rescoring with Transformers (Oct 24, 2019). Tags: Knowledge Distillation, Language Modeling. [Unverified]
Bridging Fairness and Environmental Sustainability in Natural Language Processing (Nov 8, 2022). Tags: Dimensionality Reduction, Fairness. [Unverified]
An Empirical Investigation into the Effect of Parameter Choices in Knowledge Distillation (Jan 12, 2024). Tags: Knowledge Distillation. [Unverified]
Addressing Bias Through Ensemble Learning and Regularized Fine-Tuning (Feb 1, 2024). Tags: Ensemble Learning, Knowledge Distillation. [Unverified]
DiReDi: Distillation and Reverse Distillation for AIoT Applications (Sep 12, 2024). Tags: Knowledge Distillation, Management. [Unverified]
Direct Preference Knowledge Distillation for Large Language Models (Jun 28, 2024). Tags: Knowledge Distillation. [Unverified]
Bridging Classical and Quantum Machine Learning: Knowledge Transfer From Classical to Quantum Neural Networks Using Knowledge Distillation (Nov 23, 2023). Tags: Dimensionality Reduction, Image Classification. [Unverified]
An Empirical Analysis of the Impact of Data Augmentation on Knowledge Distillation (Jun 6, 2020). Tags: Data Augmentation, Knowledge Distillation. [Unverified]
Direct Distillation between Different Domains (Jan 12, 2024). Tags: Domain Adaptation, Knowledge Distillation. [Unverified]
Direct Alignment of Draft Model for Speculative Decoding with Chat-Fine-Tuned LLMs (Feb 29, 2024). Tags: Dataset Generation, Knowledge Distillation. [Unverified]
DiPair: Fast and Accurate Distillation for Trillion-Scale Text Matching and Pair Modeling (Oct 7, 2020). Tags: Knowledge Distillation, Question Answering. [Unverified]
Bridge the Gap between Past and Future: Siamese Model Optimization for Context-Aware Document Ranking (May 20, 2025). Tags: Document Ranking, Information Retrieval. [Unverified]
An Efficient Private GPT Never Autoregressively Decodes (May 21, 2025). Tags: Knowledge Distillation. [Unverified]
A Comparative Analysis of Task-Agnostic Distillation Methods for Compressing Transformer Language Models (Oct 13, 2023). Tags: Knowledge Distillation. [Unverified]
Ground Reaction Force Estimation via Time-aware Knowledge Distillation (Jun 12, 2025). Tags: Knowledge Distillation. [Unverified]
DILEMMA: Joint LLM Quantization and Distributed LLM Inference Over Edge Computing Systems (Mar 3, 2025). Tags: Edge Computing, Knowledge Distillation. [Unverified]
Breaking the trade-off in personalized speech enhancement with cross-task knowledge distillation (Nov 5, 2022). Tags: Knowledge Distillation, Speech Enhancement. [Unverified]
DilateQuant: Accurate and Efficient Diffusion Quantization via Weight Dilation (Sep 22, 2024). Tags: Image Generation, Knowledge Distillation. [Unverified]
Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning (Mar 10, 2023). Tags: Federated Learning, Knowledge Distillation. [Unverified]
Breaking the Modality Barrier: Universal Embedding Learning with Multimodal LLMs (Apr 24, 2025). Tags: Image-Text Retrieval, Instruction Following. [Unverified]
Digging Deeper into CRNN Model in Chinese Text Images Recognition (Nov 17, 2020). Tags: Denoising, Knowledge Distillation. [Unverified]