All papers listed below have code available.

- InDistill: Information flow-preserving knowledge distillation for model compression (May 20, 2022). Tags: Knowledge Distillation, Model Compression
- Domain Generalization for Crop Segmentation with Standardized Ensemble Knowledge Distillation (Apr 3, 2023). Tags: Domain Generalization, Knowledge Distillation
- Hybrid Attention Model Using Feature Decomposition and Knowledge Distillation for Glucose Forecasting (Nov 16, 2024). Tags: Knowledge Distillation
- Hybrid Data-Free Knowledge Distillation (Dec 18, 2024). Tags: Data-free Knowledge Distillation, Generative Adversarial Network
- Human Guided Exploitation of Interpretable Attention Patterns in Summarization and Topic Segmentation (Dec 10, 2021). Tags: Extractive Summarization, Knowledge Distillation
- Multi-stage Distillation Framework for Cross-Lingual Semantic Similarity Matching (Sep 13, 2022). Tags: Contrastive Learning, Knowledge Distillation
- HVDistill: Transferring Knowledge from Images to Point Clouds via Unsupervised Hybrid-View Distillation (Mar 18, 2024). Tags: Knowledge Distillation, NER
- Domain Adaptable Fine-Tune Distillation Framework For Advancing Farm Surveillance (Feb 10, 2024). Tags: Computational Efficiency, Knowledge Distillation
- HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression (Oct 16, 2021). Tags: Few-Shot Learning, Knowledge Distillation
- DOGe: Defensive Output Generation for LLM Protection Against Knowledge Distillation (May 26, 2025). Tags: Knowledge Distillation
- Context Unaware Knowledge Distillation for Image Retrieval (Jul 19, 2022). Tags: Image Retrieval, Knowledge Distillation
- Does Training with Synthetic Data Truly Protect Privacy? (Feb 18, 2025). Tags: Data-free Knowledge Distillation, Dataset Distillation
- HTR-JAND: Handwritten Text Recognition with Joint Attention Network and Knowledge Distillation (Dec 24, 2024). Tags: Computational Efficiency, Handwritten Text Recognition
- Approximating Interactive Human Evaluation with Self-Play for Open-Domain Dialog Systems (Jun 21, 2019). Tags: Dialogue Evaluation, Knowledge Distillation
- Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy (Aug 29, 2022). Tags: Data-free Knowledge Distillation, Knowledge Distillation
- CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective (Apr 22, 2024). Tags: Contrastive Learning, Image Classification
- DMSSN: Distilled Mixed Spectral-Spatial Network for Hyperspectral Salient Object Detection (Mar 31, 2024). Tags: Dimensionality Reduction, Knowledge Distillation
- Exploiting CLIP for Zero-shot HOI Detection Requires Knowledge Distillation at Multiple Levels (Sep 10, 2023). Tags: Human-Object Interaction Detection, Knowledge Distillation
- Low-Cost Self-Ensembles Based on Multi-Branch Transformation and Grouped Convolution (Aug 5, 2024). Tags: Classification, Diversity
- How Knowledge Distillation Mitigates the Synthetic Gap in Fair Face Recognition (Aug 30, 2024). Tags: Face Recognition, Fairness
- How to Train the Teacher Model for Effective Knowledge Distillation (Jul 25, 2024). Tags: Knowledge Distillation
- HiTSR: A Hierarchical Transformer for Reference-based Super-Resolution (Aug 30, 2024). Tags: Image Super-Resolution, Knowledge Distillation
- Highlight Every Step: Knowledge Distillation via Collaborative Teaching (Jul 23, 2019). Tags: Knowledge Distillation
- Advancing Compressed Video Action Recognition through Progressive Knowledge Distillation (Jul 2, 2024). Tags: Action Recognition, Knowledge Distillation
- Holistic White-light Polyp Classification via Alignment-free Dense Distillation of Auxiliary Optical Chromoendoscopy (May 25, 2025). Tags: Diagnostic, Knowledge Distillation
- Applying Knowledge Distillation to Improve Weed Mapping With Drones (Oct 8, 2023). Tags: Knowledge Distillation, Management
- Chemical transformer compression for accelerating both training and inference of molecular modeling (May 16, 2022). Tags: Knowledge Distillation, Model Compression
- Distribution Aligned Semantics Adaption for Lifelong Person Re-Identification (May 30, 2024). Tags: Knowledge Distillation, Person Re-Identification
- Distributed Soft Actor-Critic with Multivariate Reward Representation and Knowledge Distillation (Nov 29, 2019). Tags: Knowledge Distillation, Reinforcement Learning
- Exploring Hyperspectral Anomaly Detection with Human Vision: A Small Target Aware Detector (Jan 2, 2024). Tags: Anomaly Detection, Knowledge Distillation
- TinyBERT: Distilling BERT for Natural Language Understanding (Sep 23, 2019). Tags: Knowledge Distillation, Language Modelling
- HDKD: Hybrid Data-Efficient Knowledge Distillation Network for Medical Image Classification (Jul 10, 2024). Tags: Computational Efficiency, Image Classification
- Induced Model Matching: How Restricted Models Can Help Larger Ones (Feb 19, 2024). Tags: Knowledge Distillation, Language Modeling
- Not Far Away, Not So Close: Sample Efficient Nearest Neighbour Data Augmentation via MiniMax (May 28, 2021). Tags: Data Augmentation, Knowledge Distillation
- Group Multi-View Transformer for 3D Shape Analysis with Spatial Encoding (Dec 27, 2023). Tags: 3D Classification, 3D Shape Recognition
- GSB: Group Superposition Binarization for Vision Transformer with Limited Training Samples (May 13, 2023). Tags: Binarization, Knowledge Distillation
- GSSF: Generalized Structural Sparse Function for Deep Cross-modal Metric Learning (Oct 20, 2024). Tags: Image Retrieval, Image-text Retrieval
- Greedy-layer Pruning: Speeding up Transformer Models for Natural Language Processing (May 31, 2021). Tags: Knowledge Distillation, Unsupervised Pre-training
- Guiding Frame-Level CTC Alignments Using Self-knowledge Distillation (Jun 12, 2024). Tags: Automatic Speech Recognition (ASR)
- Exploring Social Media for Early Detection of Depression in COVID-19 Patients (Feb 23, 2023). Tags: Knowledge Distillation
- Exploring Target Representations for Masked Autoencoders (Sep 8, 2022). Tags: Image Classification, Instance Segmentation
- Graph Knowledge Distillation to Mixture of Experts (Jun 17, 2024). Tags: Knowledge Distillation, Mixture-of-Experts
- Graph-based Knowledge Distillation by Multi-head Attention Network (Jul 4, 2019). Tags: Inductive Bias, Knowledge Distillation
- Graph Entropy Minimization for Semi-supervised Node Classification (May 31, 2023). Tags: Classification, Knowledge Distillation
- Distill n' Explain: explaining graph neural networks using simple surrogates (Mar 17, 2023). Tags: Knowledge Distillation
- GOTHAM: Graph Class Incremental Learning Framework under Weak Supervision (Apr 7, 2025). Tags: Attribute, Class-Incremental Learning
- Distilling Virtual Examples for Long-tailed Recognition (Mar 28, 2021). Tags: Knowledge Distillation, Long-tail Learning
- Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data (Jul 7, 2023). Tags: Knowledge Distillation, Model Compression
- Spending Your Winning Lottery Better After Drawing It (Jan 8, 2021). Tags: Knowledge Distillation
- A Dual-Contrastive Framework for Low-Resource Cross-Lingual Named Entity Recognition (Apr 2, 2022). Tags: Contrastive Learning, Cross-Lingual NER