A Knowledge Distillation Ensemble Framework for Predicting Short and Long-term Hospitalisation Outcomes from Electronic Health Records Data (Nov 18, 2020) - Decision Making, ICU Admission
Knowledge Extraction with No Observable Data (Dec 1, 2019) - Data-free Knowledge Distillation, Knowledge Distillation
KnowledgeSG: Privacy-Preserving Synthetic Text Generation with Knowledge Distillation from Server (Oct 8, 2024) - Federated Learning, Knowledge Distillation
Autoregressive Knowledge Distillation through Imitation Learning (Sep 15, 2020) - Imitation Learning, Knowledge Distillation
Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (May 15, 2018) - Adversarial Attack, Knowledge Distillation
Correlation Congruence for Knowledge Distillation (Apr 3, 2019) - Face Recognition, Image Classification
TextKD-GAN: Text Generation using Knowledge Distillation and Generative Adversarial Networks (Apr 23, 2019) - Image Generation, Knowledge Distillation
Knowledge distillation to effectively attain both region-of-interest and global semantics from an image where multiple objects appear (Jul 11, 2024) - Knowledge Distillation, Object Detection
A Knowledge Distillation-Based Approach to Enhance Transparency of Classifier Models (Feb 21, 2025) - Decision Making, Knowledge Distillation
CoReD: Generalizing Fake Media Detection with Continual Representation using Distillation (Jul 6, 2021) - Continual Learning, Domain Adaptation
Automatic Assignment of Radiology Examination Protocols Using Pre-trained Language Models with Knowledge Distillation (Sep 1, 2020) - Data Augmentation, Knowledge Distillation
Knowledge Distillation via Instance Relationship Graph (Jun 1, 2019) - Knowledge Distillation
Knowledge Distillation with Reptile Meta-Learning for Pretrained Language Model Compression (Oct 1, 2022) - Knowledge Distillation, Language Modeling
Cooperative Retriever and Ranker in Deep Recommenders (Jun 28, 2022) - Knowledge Distillation, Recommendation Systems
Automatic adaptation of object detectors to new domains using self-training (Apr 15, 2019) - Domain Adaptation, Knowledge Distillation
Cooperative Knowledge Distillation: A Learner Agnostic Approach (Feb 2, 2024) - Counterfactual, Knowledge Distillation
Automated Knowledge Distillation via Monte Carlo Tree Search (Jan 1, 2023) - Image Classification
Knowledge Distillation of Russian Language Models with Reduction of Vocabulary (May 4, 2022) - Knowledge Distillation
Cooperative Classification and Rationalization for Graph Generalization (Mar 10, 2024) - Classification, Graph Classification
Knowledge Distillation Layer that Lets the Student Decide (Sep 6, 2023) - Knowledge Distillation
Knowledge Distillation in RNN-Attention Models for Early Prediction of Student Performance (Dec 19, 2024) - Knowledge Distillation, Student Dropout
Knowledge Distillation Performs Partial Variance Reduction (May 27, 2023) - Knowledge Distillation
On the Byzantine-Resilience of Distillation-Based Federated Learning (Feb 19, 2024) - Federated Learning, Knowledge Distillation
Knowledge Distillation for Quality Estimation (Jul 1, 2021) - Data Augmentation, Knowledge Distillation
Knowledge Distillation for Singing Voice Detection (Nov 9, 2020) - Information Retrieval, Knowledge Distillation
Knowledge Distillation for Multi-Target Domain Adaptation in Real-Time Person Re-Identification (May 12, 2022) - Domain Adaptation, Knowledge Distillation
Knowledge Distillation For Wireless Edge Learning (Apr 3, 2021) - Cloud Computing, Federated Learning
Contrastive Learning in Distilled Models (Jan 23, 2024) - Contrastive Learning, Knowledge Distillation
Knowledge Distillation for Detection Transformer with Consistent Distillation Points Sampling (Nov 15, 2022) - General Knowledge, Knowledge Distillation
Knowledge Distillation for End-to-End Person Search (Sep 3, 2019) - Knowledge Distillation, Model Compression
MiniDisc: Minimal Distillation Schedule for Language Model Compression (May 29, 2022) - Knowledge Distillation, Language Modeling
Knowledge Distillation By Sparse Representation Matching (Mar 31, 2021) - Knowledge Distillation, Representation Learning
Contrastive Conditioning for Assessing Disambiguation in MT: A Case Study of Distilled Bias (May 1, 2021) - Knowledge Distillation, Machine Translation
Knowledge Distillation by On-the-Fly Native Ensemble (Jun 12, 2018) - Computational Efficiency, Image Classification
AI-KD: Towards Alignment Invariant Face Image Quality Assessment Using Knowledge Distillation (Apr 15, 2024) - Face Alignment, Face Image Quality
Knowledge Distillation-Based Model Extraction Attack using GAN-based Private Counterfactual Explanations (Apr 4, 2024) - Counterfactual, Knowledge Distillation
A Unified Object Counting Network with Object Occupation Prior (Dec 29, 2022) - Crowd Counting, Knowledge Distillation
Knowledge Distillation as Semiparametric Inference (Apr 20, 2021) - Knowledge Distillation, Model Compression
Continual Representation Learning for Biometric Identification (Jun 8, 2020) - Continual Learning, Knowledge Distillation
Continual Panoptic Perception: Towards Multi-modal Incremental Interpretation of Remote Sensing Images (Jul 19, 2024) - Caption Generation, Continual Learning
Continual Knowledge Distillation for Neural Machine Translation (Dec 18, 2022) - Knowledge Distillation, Machine Translation
KD-VLP: Improving End-to-End Vision-and-Language Pretraining with Object Knowledge Distillation (Sep 22, 2021) - Cross-modal Alignment, Knowledge Distillation
Leveraging Entity Information for Cross-Modality Correlation Learning: The Entity-Guided Multimodal Summarization (Aug 6, 2024) - Knowledge Distillation, Language Modeling
Joint Progressive Knowledge Distillation and Unsupervised Domain Adaptation (May 16, 2020) - Domain Adaptation, Knowledge Distillation
Joint Pre-training and Local Re-training: Transferable Representation Learning on Multi-source Knowledge Graphs (Jun 5, 2023) - Entity Alignment, Knowledge Distillation
KDMOS: Knowledge Distillation for Motion Segmentation (Jun 17, 2025) - Autonomous Driving, Knowledge Distillation
Knowledge Distillation approach towards Melanoma Detection (Oct 14, 2022) - Knowledge Distillation
Continual Contrastive Learning for Image Classification (Jul 5, 2021) - Classification, Continual Learning
Continual Coarse-to-Fine Domain Adaptation in Semantic Segmentation (Jan 18, 2022) - Domain Adaptation, Knowledge Distillation
AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation (Mar 11, 2024) - Data-free Knowledge Distillation, Knowledge Distillation