- Alignahead: Online Cross-Layer Knowledge Extraction on Graph Neural Networks (May 5, 2022). Tags: Knowledge Distillation
- Knowledge Distillation for Singing Voice Detection (Nov 9, 2020). Tags: Information Retrieval, Knowledge Distillation
- TinyBERT: Distilling BERT for Natural Language Understanding (Sep 23, 2019). Tags: Knowledge Distillation, Language Modelling
- Theory and Experiments on Vector Quantized Autoencoders (May 28, 2018). Tags: Image Generation, Knowledge Distillation
- Knowledge Distillation for Quality Estimation (Jul 1, 2021). Tags: Data Augmentation, Knowledge Distillation
- Whole-slide-imaging Cancer Metastases Detection and Localization with Limited Tumorous Data (Mar 18, 2023). Tags: Knowledge Distillation, Medical Image Analysis
- Lightweight Self-Knowledge Distillation with Multi-source Information Fusion (May 16, 2023). Tags: Knowledge Distillation, Self-Knowledge Distillation
- ThermoStereoRT: Thermal Stereo Matching in Real Time via Knowledge Distillation and Attention-based Refinement (Apr 10, 2025). Tags: Knowledge Distillation, Stereo Matching
- Content Based Singing Voice Extraction From a Musical Mixture (Feb 12, 2020). Tags: Decoder, Deep Learning
- Knowledge Distillation for Multi-Target Domain Adaptation in Real-Time Person Re-Identification (May 12, 2022). Tags: Domain Adaptation, Knowledge Distillation
- LILA-BOTI: Leveraging Isolated Letter Accumulations By Ordering Teacher Insights for Bangla Handwriting Recognition (May 23, 2022). Tags: Handwriting Recognition, Knowledge Distillation
- A Survey on the Robustness of Computer Vision Models against Common Corruptions (May 10, 2023). Tags: Data Augmentation, Knowledge Distillation
- SKDCGN: Source-free Knowledge Distillation of Counterfactual Generative Networks using cGANs (Aug 8, 2022). Tags: Counterfactual, Knowledge Distillation
- PyNET-QxQ: An Efficient PyNET Variant for QxQ Bayer Pattern Demosaicing in CMOS Image Sensors (Mar 8, 2022). Tags: Demosaicking, Knowledge Distillation
- Knowledge Distillation for End-to-End Person Search (Sep 3, 2019). Tags: Knowledge Distillation, Model Compression
- CONetV2: Efficient Auto-Channel Size Optimization for CNNs (Oct 13, 2021). Tags: Knowledge Distillation, Neural Architecture Search
- Answering Diverse Questions via Text Attached with Key Audio-Visual Clues (Mar 11, 2024). Tags: Audio-Visual Question Answering (AVQA)
- Knowledge Distillation for Detection Transformer with Consistent Distillation Points Sampling (Nov 15, 2022). Tags: General Knowledge, Knowledge Distillation
- Knowledge Distillation By Sparse Representation Matching (Mar 31, 2021). Tags: Knowledge Distillation, Representation Learning
- LIDAR and Position-Aided mmWave Beam Selection with Non-local CNNs and Curriculum Training (Apr 29, 2021). Tags: Knowledge Distillation, Position
- Domain Adaptable Fine-Tune Distillation Framework For Advancing Farm Surveillance (Feb 10, 2024). Tags: Computational Efficiency, Knowledge Distillation
- SkinDistilViT: Lightweight Vision Transformer for Skin Lesion Classification (Aug 16, 2023). Tags: Cancer Classification, Classification
- The State of Knowledge Distillation for Classification (Dec 20, 2019). Tags: Classification, Data Augmentation
- SlideGCD: Slide-based Graph Collaborative Training with Knowledge Distillation for Whole Slide Image Classification (Jul 12, 2024). Tags: Graph Construction, Graph Learning
- Knowledge Distillation by On-the-Fly Native Ensemble (Jun 12, 2018). Tags: Computational Efficiency, Image Classification
- TOP-Training: Target-Oriented Pretraining for Medical Extractive Question Answering (Oct 25, 2023). Tags: Domain Adaptation, Extractive Question-Answering
- Knowledge Distillation-Based Model Extraction Attack using GAN-based Private Counterfactual Explanations (Apr 4, 2024). Tags: Counterfactual, Knowledge Distillation
- Slimmable Networks for Contrastive Self-supervised Learning (Sep 30, 2022). Tags: Contrastive Learning, Knowledge Distillation
- SlimNets: An Exploration of Deep Model Compression and Acceleration (Aug 1, 2018). Tags: Knowledge Distillation, Model Compression
- DOGe: Defensive Output Generation for LLM Protection Against Knowledge Distillation (May 26, 2025). Tags: Knowledge Distillation
- The Trilemma of Truth in Large Language Models (Jun 30, 2025). Tags: Attribute, Conformal Prediction
- Knowledge Distillation as Semiparametric Inference (Apr 20, 2021). Tags: Knowledge Distillation, Model Compression
- Knowledge Distillation approach towards Melanoma Detection (Oct 14, 2022). Tags: Knowledge Distillation
- Is Smaller Always Faster? Tradeoffs in Compressing Self-Supervised Speech Transformers (Nov 17, 2022). Tags: Knowledge Distillation, Model Compression
- LLMQuoter: Enhancing RAG Capabilities Through Efficient Quote Extraction From Large Contexts (Jan 9, 2025). Tags: Knowledge Distillation, RAG
- Complex Facial Expression Recognition Using Deep Knowledge Distillation of Basic Features (Aug 11, 2023). Tags: Continual Learning, Emotion Recognition
- Smaller3d: Smaller Models for 3D Semantic Segmentation Using Minkowski Engine and Knowledge Distillation Methods (May 4, 2023). Tags: 3D Semantic Segmentation, Knowledge Distillation
- QUEST: Quantized embedding space for transferring knowledge (Dec 3, 2019). Tags: Knowledge Distillation
- KD-VLP: Improving End-to-End Vision-and-Language Pretraining with Object Knowledge Distillation (Sep 22, 2021). Tags: Cross-Modal Alignment, Knowledge Distillation
- KDMOS: Knowledge Distillation for Motion Segmentation (Jun 17, 2025). Tags: Autonomous Driving, Knowledge Distillation
- Joint Progressive Knowledge Distillation and Unsupervised Domain Adaptation (May 16, 2020). Tags: Domain Adaptation, Knowledge Distillation
- Localized Symbolic Knowledge Distillation for Visual Commonsense Models (Dec 8, 2023). Tags: Image Description, Instruction Following
- Locally Differentially Private Distributed Deep Learning via Knowledge Distillation (Feb 7, 2022). Tags: Deep Learning, Knowledge Distillation
- Zero-Shot Knowledge Distillation in Deep Networks (May 20, 2019). Tags: Knowledge Distillation
- QuIIL at T3 challenge: Towards Automation in Life-Saving Intervention Procedures from First-Person View (Jul 18, 2024). Tags: Action Anticipation, Action Recognition
- A Lightweight Target-Driven Network of Stereo Matching for Inland Waterways (Oct 10, 2024). Tags: Autonomous Navigation, Knowledge Distillation
- Visual Relationship Detection with Language prior and Softmax (Apr 16, 2019). Tags: Knowledge Distillation, Relationship Detection
- Does Training with Synthetic Data Truly Protect Privacy? (Feb 18, 2025). Tags: Data-free Knowledge Distillation, Dataset Distillation
- Complementary Calibration: Boosting General Continual Learning with Collaborative Distillation and Self-Supervision (Sep 3, 2021). Tags: Continual Learning, Contrastive Learning
- Annealing Knowledge Distillation (Apr 14, 2021). Tags: Image Classification