- ReffAKD: Resource-efficient Autoencoder-based Knowledge Distillation (Apr 15, 2024) [Knowledge Distillation]
- Collaborative Learning of Bidirectional Decoders for Unsupervised Text Style Transfer (Nov 1, 2021) [Attribute, Decoder]
- Refined Response Distillation for Class-Incremental Player Detection (May 1, 2023) [Knowledge Distillation, object-detection]
- MicroExpNet: An Extremely Small and Fast Model For Expression Recognition From Face Images (Nov 19, 2017) [CPU, Facial Expression Recognition]
- Image Recognition with Online Lightweight Vision Transformer: A Survey (May 6, 2025) [Knowledge Distillation, Survey]
- Distilling Object Detectors With Global Knowledge (Oct 17, 2022) [Knowledge Distillation, Object]
- Low-Energy On-Device Personalization for MCUs (Mar 12, 2024) [Knowledge Distillation, Transfer Learning]
- MIDAS: Multi-level Intent, Domain, And Slot Knowledge Distillation for Multi-turn NLU (Aug 15, 2024) [domain classification, Intent Detection]
- Hybrid Data-Free Knowledge Distillation (Dec 18, 2024) [Data-free Knowledge Distillation, Generative Adversarial Network]
- Collaborative Deep Reinforcement Learning (Feb 19, 2017) [Deep Reinforcement Learning, Knowledge Distillation]
- Cogni-Net: Cognitive Feature Learning through Deep Visual Perception (Nov 1, 2018) [EEG, Electroencephalogram (EEG)]
- MimicGait: A Model Agnostic approach for Occluded Gait Recognition using Correlational Knowledge Distillation (Jan 26, 2025) [Gait Recognition, Gait Recognition in the Wild]
- Regression-Oriented Knowledge Distillation for Lightweight Ship Orientation Angle Prediction with Optical Remote Sensing Images (Jul 13, 2023) [Knowledge Distillation, Prediction]
- Distilling Object Detectors with Fine-grained Feature Imitation (Jun 9, 2019) [Knowledge Distillation, Object]
- Hybrid Attention Model Using Feature Decomposition and Knowledge Distillation for Glucose Forecasting (Nov 16, 2024) [Knowledge Distillation]
- REHRSeg: Unleashing the Power of Self-Supervised Super-Resolution for Resource-Efficient 3D MRI Segmentation (Oct 14, 2024) [Knowledge Distillation, Medical Image Analysis]
- HVDistill: Transferring Knowledge from Images to Point Clouds via Unsupervised Hybrid-View Distillation (Mar 18, 2024) [Knowledge Distillation, NER]
- Distilling Reasoning Capabilities into Smaller Language Models (Dec 1, 2022) [GSM8K, Knowledge Distillation]
- Minimizing PLM-Based Few-Shot Intent Detectors (Jul 13, 2024) [Data Augmentation, Knowledge Distillation]
- Human Guided Exploitation of Interpretable Attention Patterns in Summarization and Topic Segmentation (Dec 10, 2021) [Extractive Summarization, Knowledge Distillation]
- TSPipe: Learn from Teacher Faster with Pipelines (Jul 17, 2022) [GPU, Knowledge Distillation]
- Reinforced Knowledge Distillation for Time Series Regression (Jun 21, 2024) [Knowledge Distillation, Model Compression]
- A Flexible Multi-Task Model for BERT Serving (Jul 12, 2021) [Knowledge Distillation, model]
- HTR-JAND: Handwritten Text Recognition with Joint Attention Network and Knowledge Distillation (Dec 24, 2024) [Computational Efficiency, Handwritten Text Recognition]
- Rejuvenating Low-Frequency Words: Making the Most of Parallel Data in Non-Autoregressive Translation (Jun 2, 2021) [Knowledge Distillation, Translation]
- Relational Diffusion Distillation for Efficient Image Generation (Oct 10, 2024) [Image Generation, Knowledge Distillation]
- Relational Knowledge Distillation (Apr 10, 2019) [Knowledge Distillation, Metric Learning]
- HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression (Oct 16, 2021) [Few-Shot Learning, Knowledge Distillation]
- Distilling Model Knowledge (Oct 8, 2015) [Bayesian Inference, BIG-bench Machine Learning]
- How to Train the Teacher Model for Effective Knowledge Distillation (Jul 25, 2024) [Knowledge Distillation]
- MixedTeacher: Knowledge Distillation for fast inference textural anomaly detection (Jun 16, 2023) [Anomaly Detection, Knowledge Distillation]
- Topology-Guided Knowledge Distillation for Efficient Point Cloud Processing (May 12, 2025) [3D Object Recognition, Autonomous Driving]
- Distilling Local Texture Features for Colorectal Tissue Classification in Low Data Regimes (Jan 2, 2024) [Knowledge Distillation]
- Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy (Aug 29, 2022) [Data-free Knowledge Distillation, Knowledge Distillation]
- CL-XABSA: Contrastive Learning for Cross-lingual Aspect-based Sentiment Analysis (Apr 2, 2022) [Aspect-Based Sentiment Analysis (ABSA)]
- Mixture of Modular Experts: Distilling Knowledge from a Multilingual Teacher into Specialized Modular Language Models (Jul 28, 2024) [Knowledge Distillation, Mixture-of-Experts]
- Relative Difficulty Distillation for Semantic Segmentation (Jul 4, 2024) [Knowledge Distillation, Semantic Segmentation]
- Self-Supervised Z-Slice Augmentation for 3D Bio-Imaging via Knowledge Distillation (Mar 5, 2025) [Generative Adversarial Network, Knowledge Distillation]
- How Knowledge Distillation Mitigates the Synthetic Gap in Fair Face Recognition (Aug 30, 2024) [Face Recognition, Fairness]
- Releasing Graph Neural Networks with Differential Privacy Guarantees (Sep 18, 2021) [Knowledge Distillation, Privacy Preserving]
- Holistic White-light Polyp Classification via Alignment-free Dense Distillation of Auxiliary Optical Chromoendoscopy (May 25, 2025) [Diagnostic, Knowledge Distillation]
- Distilling Knowledge for Empathy Detection (Nov 1, 2021) [Knowledge Distillation]
- RELIANT: Fair Knowledge Distillation for Graph Neural Networks (Jan 3, 2023) [Fairness, Graph Learning]
- HiTSR: A Hierarchical Transformer for Reference-based Super-Resolution (Aug 30, 2024) [Image Super-Resolution, Knowledge Distillation]
- Highlight Every Step: Knowledge Distillation via Collaborative Teaching (Jul 23, 2019) [Knowledge Distillation]
- HDKD: Hybrid Data-Efficient Knowledge Distillation Network for Medical Image Classification (Jul 10, 2024) [Computational Efficiency, image-classification]
- Distilling Knowledge for Designing Computational Imaging Systems (Jan 29, 2025) [Decoder, Image Reconstruction]
- MOD: A Deep Mixture Model with Online Knowledge Distillation for Large Scale Video Temporal Concept Localization (Oct 27, 2019) [Knowledge Distillation, Video Understanding]
- Handling Data Heterogeneity in Federated Learning via Knowledge Distillation and Fusion (Jul 23, 2022) [Data-free Knowledge Distillation, Fairness]
- Cluster-aware Semi-supervised Learning: Relational Knowledge Distillation Provably Learns Clustering (Jul 20, 2023) [Clustering, Data Augmentation]