Densely Guided Knowledge Distillation using Multiple Teacher Assistants (Sep 18, 2020). Tags: Knowledge Distillation, Model Compression.
Noisy Self-Knowledge Distillation for Text Summarization (Sep 15, 2020). Tags: Knowledge Distillation, Self-Knowledge Distillation. [Code Available]
Transferring Knowledge Distillation for Multilingual Social Event Detection (Aug 6, 2021). Tags: Cross-Lingual Word Embeddings, Event Detection. [Code Available]
DE-RRD: A Knowledge Distillation Framework for Recommender System (Dec 8, 2020). Tags: Knowledge Distillation, Model Compression. [Code Available]
Boosting Light-Weight Depth Estimation Via Knowledge Distillation (May 13, 2021). Tags: Computational Efficiency, Depth Estimation. [Code Available]
Transport-Hub-Aware Spatial-Temporal Adaptive Graph Transformer for Traffic Flow Prediction (Oct 12, 2023). Tags: Incremental Learning, Knowledge Distillation. [Code Available]
TSCM: A Teacher-Student Model for Vision Place Recognition Using Cross-Metric Knowledge Distillation (Apr 2, 2024). Tags: Knowledge Distillation, Visual Place Recognition. [Code Available]
Brittle Features May Help Anomaly Detection (Apr 21, 2021). Tags: Anomaly Detection, Knowledge Distillation. [Code Available]
Bring the Power of Diffusion Model to Defect Detection (Aug 25, 2024). Tags: Defect Detection, Denoising. [Unverified]
An Enhanced Low-Resolution Image Recognition Method for Traffic Environments (Sep 28, 2023). Tags: Computational Efficiency, Knowledge Distillation. [Unverified]
Bridging the Modality Gap: Enhancing Channel Prediction with Semantically Aligned LLMs and Knowledge Distillation (May 19, 2025). Tags: Knowledge Distillation, Prediction. [Unverified]
Bridging the Gap: Unpacking the Hidden Challenges in Knowledge Distillation for Online Ranking Systems (Aug 26, 2024). Tags: Knowledge Distillation, Recommendation Systems. [Unverified]
An Empirical Study of Uniform-Architecture Knowledge Distillation in Document Ranking (Feb 8, 2023). Tags: Document Ranking, Knowledge Distillation. [Unverified]
Bridging the Gap between Prior and Posterior Knowledge Selection for Knowledge-Grounded Dialogue Generation (Nov 1, 2020). Tags: Decoder, Dialogue Generation. [Unverified]
Bridging the Gap Between Patient-specific and Patient-independent Seizure Prediction via Knowledge Distillation (Feb 25, 2022). Tags: Knowledge Distillation, Prediction. [Unverified]
A Deep Hierarchical Feature Sparse Framework for Occluded Person Re-Identification (Jan 15, 2024). Tags: Data Augmentation, Knowledge Distillation. [Unverified]
Bridging the gap between Human Action Recognition and Online Action Detection (Jan 21, 2021). Tags: Action Detection, Action Recognition. [Unverified]
An Empirical Study of Leveraging Knowledge Distillation for Compressing Multilingual Neural Machine Translation Models (Apr 19, 2023). Tags: Knowledge Distillation, Machine Translation. [Unverified]
Supervised domain adaptation for building extraction from off-nadir aerial images (Nov 7, 2023). Tags: Domain Adaptation, Earth Observation. [Unverified]
Dual Discriminator Adversarial Distillation for Data-free Model Compression (Apr 12, 2021). Tags: Data-free Knowledge Distillation, Knowledge Distillation. [Unverified]
Dual Embodied-Symbolic Concept Representations for Deep Learning (Mar 1, 2022). Tags: Class-Incremental Learning. [Unverified]
An Empirical Study of Efficient ASR Rescoring with Transformers (Oct 24, 2019). Tags: Knowledge Distillation, Language Modeling. [Unverified]
Ground Reaction Force Estimation via Time-aware Knowledge Distillation (Jun 12, 2025). Tags: Knowledge Distillation. [Unverified]
Bridging Fairness and Environmental Sustainability in Natural Language Processing (Nov 8, 2022). Tags: Dimensionality Reduction, Fairness. [Unverified]
An Empirical Investigation into the Effect of Parameter Choices in Knowledge Distillation (Jan 12, 2024). Tags: Knowledge Distillation. [Unverified]
Addressing Bias Through Ensemble Learning and Regularized Fine-Tuning (Feb 1, 2024). Tags: Ensemble Learning, Knowledge Distillation. [Unverified]
DS-ViT: Dual-Stream Vision Transformer for Cross-Task Distillation in Alzheimer's Early Diagnosis (Sep 11, 2024). Tags: Classification, Knowledge Distillation. [Unverified]
Direct Preference Knowledge Distillation for Large Language Models (Jun 28, 2024). Tags: Knowledge Distillation. [Unverified]
Bridging Classical and Quantum Machine Learning: Knowledge Transfer From Classical to Quantum Neural Networks Using Knowledge Distillation (Nov 23, 2023). Tags: Dimensionality Reduction, Image Classification. [Unverified]
An Empirical Analysis of the Impact of Data Augmentation on Knowledge Distillation (Jun 6, 2020). Tags: Data Augmentation, Knowledge Distillation. [Unverified]
Bridge the Gap between Past and Future: Siamese Model Optimization for Context-Aware Document Ranking (May 20, 2025). Tags: Document Ranking, Information Retrieval. [Unverified]
DiPair: Fast and Accurate Distillation for Trillion-Scale Text Matching and Pair Modeling (Oct 7, 2020). Tags: Knowledge Distillation, Question Answering. [Unverified]
An Efficient Private GPT Never Autoregressively Decodes (May 21, 2025). Tags: Knowledge Distillation. [Unverified]
A Comparative Analysis of Task-Agnostic Distillation Methods for Compressing Transformer Language Models (Oct 13, 2023). Tags: Knowledge Distillation. [Unverified]
DTCM: Deep Transformer Capsule Mutual Distillation for Multivariate Time Series Classification (Feb 26, 2024). Tags: Knowledge Distillation, Relation Network. [Unverified]
DILEMMA: Joint LLM Quantization and Distributed LLM Inference Over Edge Computing Systems (Mar 3, 2025). Tags: Edge Computing, Knowledge Distillation. [Unverified]
Breaking the trade-off in personalized speech enhancement with cross-task knowledge distillation (Nov 5, 2022). Tags: Knowledge Distillation, Speech Enhancement. [Unverified]
DilateQuant: Accurate and Efficient Diffusion Quantization via Weight Dilation (Sep 22, 2024). Tags: Image Generation, Knowledge Distillation. [Unverified]
Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning (Mar 10, 2023). Tags: Federated Learning, Knowledge Distillation. [Unverified]
Breaking the Modality Barrier: Universal Embedding Learning with Multimodal LLMs (Apr 24, 2025). Tags: Image-Text Retrieval, Instruction Following. [Unverified]
An Efficient Method of Training Small Models for Regression Problems with Knowledge Distillation (Feb 28, 2020). Tags: Knowledge Distillation, Memorization. [Unverified]
Add a SideNet to your MainNet (Jul 14, 2020). Tags: General Classification, Knowledge Distillation. [Unverified]
Direct Alignment of Draft Model for Speculative Decoding with Chat-Fine-Tuned LLMs (Feb 29, 2024). Tags: Dataset Generation, Knowledge Distillation. [Unverified]
Direct Distillation between Different Domains (Jan 12, 2024). Tags: Domain Adaptation, Knowledge Distillation. [Unverified]
Digging Deeper into CRNN Model in Chinese Text Images Recognition (Nov 17, 2020). Tags: Denoising, Knowledge Distillation. [Unverified]
DS3-Net: Difficulty-perceived Common-to-T1ce Semi-Supervised Multimodal MRI Synthesis Network (Mar 14, 2022). Tags: Knowledge Distillation, SSIM. [Unverified]
DSFormer: Effective Compression of Text-Transformers by Dense-Sparse Weight Factorization (Dec 20, 2023). Tags: Knowledge Distillation, Natural Language Understanding. [Unverified]
DiReDi: Distillation and Reverse Distillation for AIoT Applications (Sep 12, 2024). Tags: Knowledge Distillation, Management. [Unverified]
DiffusionTalker: Personalization and Acceleration for Speech-Driven 3D Face Diffuser (Nov 28, 2023). Tags: 3D Face Animation, Contrastive Learning. [Unverified]
Towards Complementary Knowledge Distillation for Efficient Dense Image Prediction (Jan 24, 2024). Tags: Implicit Relations, Instance Segmentation. [Unverified]