- Knowledge Distillation in RNN-Attention Models for Early Prediction of Student Performance (Dec 19, 2024). Tags: Knowledge Distillation, Student Dropout.
- Few Sample Knowledge Distillation for Efficient Network Compression (Dec 5, 2018) [code]. Tags: Knowledge Distillation, Network Pruning.
- Accelerated Proton Resonance Frequency-based Magnetic Resonance Thermometry by Optimized Deep Learning Method (Jul 3, 2024) [code]. Tags: Knowledge Distillation.
- Knowledge Distillation from Single to Multi Labels: an Empirical Study (Mar 15, 2023) [code]. Tags: Classification, Image Classification.
- Knowledge Distillation Layer that Lets the Student Decide (Sep 6, 2023) [code]. Tags: Knowledge Distillation.
- AttriPrompter: Auto-Prompting with Attribute Semantics for Zero-shot Nuclei Detection via Visual-Language Pre-trained Models (Oct 22, 2024) [code]. Tags: Attribute, Knowledge Distillation.
- Content Based Singing Voice Extraction From a Musical Mixture (Feb 12, 2020) [code]. Tags: Decoder, Deep Learning.
- Attentive Task Interaction Network for Multi-Task Learning (Jan 25, 2022) [code]. Tags: Decoder, Knowledge Distillation.
- AdaGMLP: AdaBoosting GNN-to-MLP Knowledge Distillation (May 23, 2024) [code]. Tags: Knowledge Distillation.
- Knowledge Distillation from Cross Teaching Teachers for Efficient Semi-Supervised Abdominal Organ Segmentation in CT (Nov 11, 2022) [code]. Tags: Image Segmentation, Knowledge Distillation.
- Knowledge Distillation for Singing Voice Detection (Nov 9, 2020) [code]. Tags: Information Retrieval, Knowledge Distillation.
- Attention to Detail: Inter-Resolution Knowledge Distillation (Jan 11, 2024) [code]. Tags: Knowledge Distillation, Whole Slide Images.
- Knowledge Distillation for End-to-End Person Search (Sep 3, 2019) [code]. Tags: Knowledge Distillation, Model Compression.
- Knowledge Distillation for Multi-Target Domain Adaptation in Real-Time Person Re-Identification (May 12, 2022) [code]. Tags: Domain Adaptation, Knowledge Distillation.
- Knowledge Distillation for Quality Estimation (Jul 1, 2021) [code]. Tags: Data Augmentation, Knowledge Distillation.
- Knowledge Distillation for Wireless Edge Learning (Apr 3, 2021) [code]. Tags: Cloud Computing, Federated Learning.
- Knowledge Distillation by Sparse Representation Matching (Mar 31, 2021) [code]. Tags: Knowledge Distillation, Representation Learning.
- Knowledge Distillation by On-the-Fly Native Ensemble (Jun 12, 2018) [code]. Tags: Computational Efficiency, Image Classification.
- Knowledge Distillation-Based Model Extraction Attack using GAN-based Private Counterfactual Explanations (Apr 4, 2024) [code]. Tags: Counterfactual, Knowledge Distillation.
- CONetV2: Efficient Auto-Channel Size Optimization for CNNs (Oct 13, 2021) [code]. Tags: Knowledge Distillation, Neural Architecture Search.
- Knowledge Distillation as Semiparametric Inference (Apr 20, 2021) [code]. Tags: Knowledge Distillation, Model Compression.
- Knowledge Distillation for Detection Transformer with Consistent Distillation Points Sampling (Nov 15, 2022) [code]. Tags: General Knowledge, Knowledge Distillation.
- Attention-Based Depth Distillation with 3D-Aware Positional Encoding for Monocular 3D Object Detection (Nov 30, 2022) [code]. Tags: 3D Object Detection, Depth Estimation.
- Attend, Distill, Detect: Attention-aware Entropy Distillation for Anomaly Detection (May 10, 2024) [code]. Tags: Anomaly Detection, Knowledge Distillation.
- AdaBERT: Task-Adaptive BERT Compression with Differentiable Neural Architecture Search (Jan 13, 2020) [code]. Tags: Knowledge Distillation, Neural Architecture Search.
- KD-VLP: Improving End-to-End Vision-and-Language Pretraining with Object Knowledge Distillation (Sep 22, 2021) [code]. Tags: Cross-Modal Alignment, Knowledge Distillation.
- ACT-Net: Asymmetric Co-Teacher Network for Semi-supervised Memory-efficient Medical Image Segmentation (Jul 5, 2022) [code]. Tags: Image Segmentation, Knowledge Distillation.
- Is Smaller Always Faster? Tradeoffs in Compressing Self-Supervised Speech Transformers (Nov 17, 2022) [code]. Tags: Knowledge Distillation, Model Compression.
- Joint Pre-training and Local Re-training: Transferable Representation Learning on Multi-source Knowledge Graphs (Jun 5, 2023) [code]. Tags: Entity Alignment, Knowledge Distillation.
- A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation (Mar 6, 2024) [code]. Tags: Knowledge Distillation.
- Joint Progressive Knowledge Distillation and Unsupervised Domain Adaptation (May 16, 2020) [code]. Tags: Domain Adaptation, Knowledge Distillation.
- A Tailored Pre-Training Model for Task-Oriented Dialog Generation (Apr 24, 2020) [code]. Tags: Knowledge Distillation, Language Modeling.
- KDMOS: Knowledge Distillation for Motion Segmentation (Jun 17, 2025) [code]. Tags: Autonomous Driving, Knowledge Distillation.
- A Systematic Study of Knowledge Distillation for Natural Language Generation with Pseudo-Target Training (May 3, 2023) [code]. Tags: Knowledge Distillation, Text Generation.
- Is Modularity Transferable? A Case Study through the Lens of Knowledge Distillation (Mar 27, 2024) [code]. Tags: Domain Adaptation, Knowledge Distillation.
- Complex Facial Expression Recognition Using Deep Knowledge Distillation of Basic Features (Aug 11, 2023) [code]. Tags: Continual Learning, Emotion Recognition.
- Comb, Prune, Distill: Towards Unified Pruning for Vision Model Compression (Aug 6, 2024) [code]. Tags: Image Classification.
- Invariant Debiasing Learning for Recommendation via Biased Imputation (Dec 28, 2024) [code]. Tags: Imputation, Knowledge Distillation.
- Joint Answering and Explanation for Visual Commonsense Reasoning (Feb 25, 2022) [code]. Tags: Knowledge Distillation, Question Answering.
- Asymmetric Masked Distillation for Pre-Training Small Foundation Models (Nov 6, 2023) [code]. Tags: Action Classification, Action Recognition.
- Intra-class Patch Swap for Self-Distillation (May 20, 2025) [code]. Tags: Image Classification.
- Active Object Detection with Knowledge Aggregation and Distillation from Large Models (May 21, 2024) [code]. Tags: Active Object Detection, Decision Making.
- Complementary Calibration: Boosting General Continual Learning with Collaborative Distillation and Self-Supervision (Sep 3, 2021) [code]. Tags: Continual Learning, Contrastive Learning.
- Interpreting Microbiome Relative Abundance Data Using Symbolic Regression (Oct 18, 2024) [code]. Tags: Diagnostic, Knowledge Distillation.
- Interpreting and Disentangling Feature Components of Various Complexity from DNNs (Jun 29, 2020) [code]. Tags: Knowledge Distillation.
- Comparative Knowledge Distillation (Nov 3, 2023) [code]. Tags: Data Augmentation, Knowledge Distillation.
- Compact Trilinear Interaction for Visual Question Answering (Sep 26, 2019) [code]. Tags: Benchmarking, Knowledge Distillation.
- Asymmetrical Reciprocity-based Federated Learning for Resolving Disparities in Medical Diagnosis (Dec 27, 2024) [code]. Tags: Diagnostic, Federated Learning.
- Instance Temperature Knowledge Distillation (Jun 27, 2024) [code]. Tags: Decision Making, Efficient Exploration.
- Infusing Sequential Information into Conditional Masked Translation Model with Self-Review Mechanism (Oct 19, 2020) [code]. Tags: Decoder, Knowledge Distillation.