Dialogue Chain-of-Thought Distillation for Commonsense-aware Conversational Agents (Oct 13, 2023). Tags: Informativeness, Knowledge Distillation.
Focal and Global Knowledge Distillation for Detectors (Nov 23, 2021). Tags: Image Classification. Code available.
Directed Acyclic Transformer for Non-Autoregressive Machine Translation (May 16, 2022). Tags: Knowledge Distillation, Machine Translation. Code available.
FoPro-KD: Fourier Prompted Effective Knowledge Distillation for Long-Tailed Medical Image Recognition (May 27, 2023). Tags: Image Classification. Code available.
Deliberation on Priors: Trustworthy Reasoning of Large Language Models on Knowledge Graphs (May 21, 2025). Tags: Knowledge Distillation, Knowledge Graphs. Code available.
Can LLM Watermarks Robustly Prevent Unauthorized Knowledge Distillation? (Feb 17, 2025). Tags: Knowledge Distillation, Language Modeling. Code available.
Prototype-based Incremental Few-Shot Semantic Segmentation (Nov 30, 2020). Tags: Few-Shot Semantic Segmentation, Incremental Learning. Code available.
General Cyclical Training of Neural Networks (Feb 17, 2022). Tags: Data Augmentation, Knowledge Distillation. Code available.
Dense Interspecies Face Embedding (Nov 28, 2022). Tags: Image Manipulation, Interspecies Facial Keypoint Transfer. Code available.
COMEDIAN: Self-Supervised Learning and Knowledge Distillation for Action Spotting using Transformers (Sep 3, 2023). Tags: Action Detection, Action Spotting. Code available.
Generative Bias for Robust Visual Question Answering (Aug 1, 2022). Tags: Knowledge Distillation, Question Answering. Code available.
Communication-Efficient Federated Learning through Adaptive Weight Clustering and Server-Side Distillation (Jan 25, 2024). Tags: Clustering, Federated Learning. Code available.
Geometer: Graph Few-Shot Class-Incremental Learning via Prototype Representation (May 27, 2022). Tags: Class-Incremental Learning. Code available.
Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks (Oct 24, 2022). Tags: Knowledge Distillation, Transfer Learning. Code available.
GlobalFlowNet: Video Stabilization using Deep Distilled Global Motion Estimates (Oct 25, 2022). Tags: Knowledge Distillation, Optical Flow Estimation. Code available.
Global Knowledge Calibration for Fast Open-Vocabulary Segmentation (Mar 16, 2023). Tags: Knowledge Distillation, Open-Vocabulary Semantic Segmentation. Code available.
A Discrepancy Aware Framework for Robust Anomaly Detection (Oct 11, 2023). Tags: Anomaly Detection, Decoder. Code available.
Good Teachers Explain: Explanation-Enhanced Knowledge Distillation (Feb 5, 2024). Tags: Knowledge Distillation. Code available.
Gradient-based Intra-attention Pruning on Pre-trained Language Models (Dec 15, 2022). Tags: Knowledge Distillation. Code available.
Graph-based Knowledge Distillation: A survey and experimental evaluation (Feb 27, 2023). Tags: Knowledge Distillation, Self-Knowledge Distillation. Code available.
A framework for benchmarking class-out-of-distribution detection and its application to ImageNet (Feb 23, 2023). Tags: Benchmarking, Knowledge Distillation. Code available.
Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation (May 19, 2021). Tags: Image Classification, Knowledge Distillation. Code available.
A Symmetric Dual Encoding Dense Retrieval Framework for Knowledge-Intensive Visual Question Answering (Apr 26, 2023). Tags: Decoder, Knowledge Distillation. Code available.
Contrastive Model Inversion for Data-Free Knowledge Distillation (May 18, 2021). Tags: Contrastive Learning, Data-Free Knowledge Distillation. Code available.
Complementary Relation Contrastive Distillation (Mar 29, 2021). Tags: Knowledge Distillation, Relation. Code available.
Group Knowledge Transfer: Federated Learning of Large CNNs at the Edge (Jul 28, 2020). Tags: Federated Learning, Knowledge Distillation. Code available.
Deliberated Domain Bridging for Domain Adaptive Semantic Segmentation (Sep 16, 2022). Tags: Domain Adaptation, Image-to-Image Translation. Code available.
HAD-Net: A Hierarchical Adversarial Knowledge Distillation Network for Improved Enhanced Tumour Segmentation Without Post-Contrast Images (Mar 30, 2021). Tags: Knowledge Distillation, Segmentation. Code available.
Contrastive Distillation on Intermediate Representations for Language Model Compression (Sep 29, 2020). Tags: Knowledge Distillation, Language Modeling. Code available.
Heterogeneous Knowledge Distillation using Information Flow Modeling (May 2, 2020). Tags: Knowledge Distillation. Code available.
Hierarchical Self-supervised Augmented Knowledge Distillation (Jul 29, 2021). Tags: Knowledge Distillation, Representation Learning. Code available.
Comprehensive Knowledge Distillation with Causal Intervention (Dec 1, 2021). Tags: Causal Inference, Knowledge Distillation. Code available.
Densely Guided Knowledge Distillation using Multiple Teacher Assistants (Sep 18, 2020). Tags: Knowledge Distillation, Model Compression. Code available.
Honest-but-Curious Nets: Sensitive Attributes of Private Inputs Can Be Secretly Coded into the Classifiers' Outputs (May 25, 2021). Tags: Attribute, Knowledge Distillation. Code available.
How to Distill your BERT: An Empirical Study on the Impact of Weight Initialisation and Distillation Objectives (May 24, 2023). Tags: Knowledge Distillation, QNLI. Code available.
How to Select One Among All? An Extensive Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding (Sep 13, 2021). Tags: Adversarial Robustness, All. Code available.
Deep Structured Instance Graph for Distilling Object Detectors (Sep 27, 2021). Tags: Instance Segmentation, Knowledge Distillation. Code available.
AgeFlow: Conditional Age Progression and Regression with Normalizing Flows (May 15, 2021). Tags: Attribute, Knowledge Distillation. Code available.
I^3 Retriever: Incorporating Implicit Interaction in Pre-trained Language Models for Passage Retrieval (Jun 4, 2023). Tags: Knowledge Distillation, Passage Retrieval. Code available.
IDa-Det: An Information Discrepancy-aware Distillation for 1-bit Detectors (Oct 7, 2022). Tags: Knowledge Distillation, Object Detection. Code available.
ABKD: Pursuing a Proper Allocation of the Probability Mass in Knowledge Distillation via α-β-Divergence (May 7, 2025). Tags: Knowledge Distillation. Code available.
Improve Cross-Architecture Generalization on Dataset Distillation (Feb 20, 2024). Tags: Dataset Distillation, Knowledge Distillation. Code available.
Improved Techniques for Training Adaptive Deep Networks (Aug 17, 2019). Tags: Computational Efficiency, Knowledge Distillation. Code available.
Improve Object Detection with Feature-based Knowledge Distillation: Towards Accurate and Efficient Detectors (Jan 1, 2021). Tags: Image Classification. Code available.
A Token is Worth over 1,000 Tokens: Efficient Knowledge Distillation through Low-Rank Clone (May 19, 2025). Tags: Knowledge Distillation, Transfer Learning. Code available.
Improving Knowledge Distillation via Regularizing Feature Norm and Direction (May 26, 2023). Tags: Domain Adaptation, Knowledge Distillation. Code available.
Improving Neural Cross-Lingual Summarization via Employing Optimal Transport Distance for Knowledge Distillation (Dec 7, 2021). Tags: Knowledge Distillation, Multi-Task Learning. Code available.
Computation-Efficient Knowledge Distillation via Uncertainty-Aware Mixup (Dec 17, 2020). Tags: Informativeness, Knowledge Distillation. Code available.
Defocus Blur Detection via Depth Distillation (Jul 16, 2020). Tags: Decoder, Defocus Blur Detection. Code available.
Camera clustering for scalable stream-based active distillation (Apr 16, 2024). Tags: Clustering, Knowledge Distillation. Code available.