BiLD: Bi-directional Logits Difference Loss for Large Language Model Distillation (Jun 19, 2024) · Knowledge Distillation, Language Modeling · code available
Simplified TinyBERT: Knowledge Distillation for Document Retrieval (Sep 16, 2020) · Document Ranking, Knowledge Distillation · code available
C2KD: Cross-Lingual Cross-Modal Knowledge Distillation for Multilingual Text-Video Retrieval (Oct 7, 2022) · Knowledge Distillation, Retrieval · code available
SLMRec: Distilling Large Language Models into Small for Sequential Recommendation (May 28, 2024) · Knowledge Distillation, Language Modeling · code available
A New Knowledge Distillation Network for Incremental Few-Shot Surface Defect Detection (Sep 1, 2022) · Defect Detection, Knowledge Distillation · code available
Decoupled Multimodal Distilling for Emotion Recognition (Mar 24, 2023) · Emotion Recognition, Knowledge Distillation · code available
Does Knowledge Distillation Really Work? (Jun 10, 2021) · Knowledge Distillation · code available
A Neural Span-Based Continual Named Entity Recognition Model (Feb 23, 2023) · Continual Learning, Continual Named Entity Recognition · code available
Domain Generalization for Prostate Segmentation in Transrectal Ultrasound Images: A Multi-center Study (Sep 5, 2022) · Domain Adaptation, Domain Generalization · code available
DPHuBERT: Joint Distillation and Pruning of Self-Supervised Speech Models (May 28, 2023) · Knowledge Distillation, Self-Supervised Learning · code available
DM-VTON: Distilled Mobile Real-time Virtual Try-On (Aug 26, 2023) · GPU, Human Parsing · code available
DeepAqua: Self-Supervised Semantic Segmentation of Wetland Surface Water Extent with SAR Images using Knowledge Distillation (May 2, 2023) · Knowledge Distillation, Semantic Segmentation · code available
Hybrid Inverted Index Is a Robust Accelerator for Dense Retrieval (Oct 11, 2022) · Knowledge Distillation, Quantization · code available
SpikingBERT: Distilling BERT to Train Spiking Language Models Using Implicit Differentiation (Aug 21, 2023) · Knowledge Distillation, Language Modeling · code available
DKDM: Data-Free Knowledge Distillation for Diffusion Models with Any Architecture (Sep 5, 2024) · Data-Free Knowledge Distillation, Denoising · code available
DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval (Jun 24, 2021) · Computational Efficiency, Knowledge Distillation · code available
Bit-mask Robust Contrastive Knowledge Distillation for Unsupervised Semantic Hashing (Mar 10, 2024) · Image Retrieval, Knowledge Distillation · code available
Deep Encoder, Shallow Decoder: Reevaluating Non-autoregressive Machine Translation (Jun 18, 2020) · Decoder, Knowledge Distillation · code available
AMFD: Distillation via Adaptive Multimodal Fusion for Multispectral Pedestrian Detection (May 21, 2024) · Knowledge Distillation, Pedestrian Detection · code available
Structured Sparse R-CNN for Direct Scene Graph Generation (Jun 21, 2021) · Graph Construction, Graph Generation · code available
Deep Graph-level Anomaly Detection by Glocal Knowledge Distillation (Dec 19, 2021) · Anomaly Detection, Knowledge Distillation · code available
DeepKD: A Deeply Decoupled and Denoised Knowledge Distillation Trainer (May 21, 2025) · Denoising, Knowledge Distillation · code available
Black-Box Attacks on Sequential Recommenders via Data-Free Model Extraction (Sep 1, 2021) · Data Poisoning, Knowledge Distillation · code available
SUOD: Toward Scalable Unsupervised Outlier Detection (Feb 8, 2020) · Knowledge Distillation, Outlier Detection · code available
Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning (Jun 11, 2023) · Knowledge Distillation, Meta-Learning · code available
Supervised Compression for Resource-Constrained Edge Computing Systems (Aug 21, 2021) · Data Compression, Edge Computing · code available
Black-box Few-shot Knowledge Distillation (Jul 25, 2022) · Image Classification · code available
Distill on the Go: Online knowledge distillation in self-supervised learning (Apr 20, 2021) · Knowledge Distillation, Self-Supervised Learning · code available
DIOD: Self-Distillation Meets Object Discovery (Jan 1, 2024) · Instance Segmentation, Knowledge Distillation · code available
Tailoring Instructions to Student's Learning Levels Boosts Knowledge Distillation (May 16, 2023) · Knowledge Distillation, Text Classification · code available
Distilling a Powerful Student Model via Online Knowledge Distillation (Mar 26, 2021) · Knowledge Distillation · code available
Deep Semi-supervised Knowledge Distillation for Overlapping Cervical Cell Instance Segmentation (Jul 21, 2020) · Instance Segmentation, Knowledge Distillation · code available
Teachers Do More Than Teach: Compressing Image-to-Image Models (Mar 5, 2021) · Knowledge Distillation · code available
Deep Structured Instance Graph for Distilling Object Detectors (Sep 27, 2021) · Instance Segmentation, Knowledge Distillation · code available
Document-Level Relation Extraction with Adaptive Focal Loss and Knowledge Distillation (Mar 21, 2022) · Document-Level Relation Extraction, Knowledge Distillation · code available
Distill the Image to Nowhere: Inversion Knowledge Distillation for Multimodal Machine Translation (Oct 10, 2022) · Knowledge Distillation, Machine Translation · code available
Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation (Jun 1, 2020) · Knowledge Distillation, Neural Architecture Search · code available
Temporal Self-Ensembling Teacher for Semi-Supervised Object Detection (Jul 13, 2020) · Image Classification · code available
Defocus Blur Detection via Depth Distillation (Jul 16, 2020) · Decoder, Defocus Blur Detection · code available
Deformation Flow Based Two-Stream Network for Lip Reading (Mar 12, 2020) · Knowledge Distillation, Lipreading · code available
The Modality Focusing Hypothesis: Towards Understanding Crossmodal Knowledge Distillation (Jun 13, 2022) · Knowledge Distillation, Transfer Learning · code available
Deliberated Domain Bridging for Domain Adaptive Semantic Segmentation (Sep 16, 2022) · Domain Adaptation, Image-to-Image Translation · code available
Deliberation on Priors: Trustworthy Reasoning of Large Language Models on Knowledge Graphs (May 21, 2025) · Knowledge Distillation, Knowledge Graphs · code available
DistilPose: Tokenized Pose Regression with Heatmap Distillation (Mar 4, 2023) · Knowledge Distillation, Pose Estimation · code available
DPM-OT: A New Diffusion Probabilistic Model Based on Optimal Transport (Jul 21, 2023) · Denoising, Knowledge Distillation · code available
EchoDFKD: Data-Free Knowledge Distillation for Cardiac Ultrasound Segmentation using Synthetic Data (Sep 11, 2024) · Data-Free Knowledge Distillation, Knowledge Distillation · code available
End-to-End Zero-Shot HOI Detection via Vision and Language Knowledge Distillation (Apr 1, 2022) · Human-Object Interaction Detection, Knowledge Distillation · code available
FerKD: Surgical Label Adaptation for Efficient Distillation (Dec 29, 2023) · Knowledge Distillation · code available
Dense Interspecies Face Embedding (Nov 28, 2022) · Image Manipulation, Interspecies Facial Keypoint Transfer · code available
Instance-Conditional Knowledge Distillation for Object Detection (Oct 25, 2021) · Image Classification, Knowledge Distillation · code available
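Most entries in this list build on the same core idea: a small student model is trained to match the temperature-softened output distribution of a larger teacher. As a point of reference, here is a minimal, stdlib-only sketch of the classic soft-target distillation loss (Hinton et al., 2015); the function names and the temperature value are illustrative and not taken from any specific paper above.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax of a list of logits."""
    scaled = [z / T for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) between temperature-softened distributions,
    scaled by T^2 so gradient magnitudes stay comparable across temperatures."""
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # student predictions
    kl = sum(pi * (math.log(pi) - math.log(qi)) for pi, qi in zip(p, q))
    return kl * T * T
```

In practice this term is mixed with the ordinary cross-entropy on ground-truth labels, e.g. loss = alpha * CE + (1 - alpha) * KD, and many of the papers listed here replace or augment the KL term with feature-level, relation-level, or logit-difference objectives.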