Attend, Distill, Detect: Attention-aware Entropy Distillation for Anomaly Detection (May 10, 2024). Tags: Anomaly Detection, Knowledge Distillation
FAKD: Feature Augmented Knowledge Distillation for Semantic Segmentation (Aug 30, 2022) [code available]. Tags: Knowledge Distillation, Segmentation
On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models (Apr 4, 2024) [code available]. Tags: Contrastive Learning, Knowledge Distillation
On the Transferability of Visual Features in Generalized Zero-Shot Learning (Nov 22, 2022) [code available]. Tags: Generalized Zero-Shot Learning, Knowledge Distillation
A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation (Mar 6, 2024) [code available]. Tags: Knowledge Distillation
On the Use of External Data for Spoken Named Entity Recognition (Dec 14, 2021) [code available]. Tags: Knowledge Distillation, Named Entity Recognition
OpenGrok: Enhancing SNS Data Processing with Distilled Knowledge and Mask-like Mechanisms (Feb 11, 2025) [code available]. Tags: Knowledge Distillation, MMLU
Dataset Distillation via Knowledge Distillation: Towards Efficient Self-Supervised Pre-Training of Deep Networks (Oct 3, 2024) [code available]. Tags: Dataset Distillation, Knowledge Distillation
Born Again Neural Networks (May 12, 2018) [code available]. Tags: Image Classification, Knowledge Distillation
Data-Free Knowledge Distillation for Image Super-Resolution (Jun 19, 2021) [code available]. Tags: Data-Free Knowledge Distillation, Image Super-Resolution
Faithful Label-free Knowledge Distillation (Nov 22, 2024) [code available]. Tags: Inductive Bias, Knowledge Distillation
Data-free Knowledge Distillation for Fine-grained Visual Categorization (Apr 18, 2024) [code available]. Tags: Data-Free Knowledge Distillation, Fine-Grained Visual Categorization
Self-Attentive Spatio-Temporal Calibration for Precise Intermediate Layer Matching in ANN-to-SNN Distillation (Jan 14, 2025) [code available]. Tags: Knowledge Distillation
Fairness without Demographics through Knowledge Distillation (Nov 1, 2022) [code available]. Tags: Fairness, Knowledge Distillation
Towards Real-time Video Compressive Sensing on Mobile Devices (Aug 14, 2024) [code available]. Tags: Compressive Sensing, Knowledge Distillation
Boosting Summarization with Normalizing Flows and Aggressive Training (Nov 1, 2023) [code available]. Tags: Decoder, Knowledge Distillation
Self-Distillation for Gaussian Process Regression and Classification (Apr 5, 2023) [code available]. Tags: Classification, GPR
Data-free Knowledge Distillation for Segmentation using Data-Enriching GAN (Nov 2, 2020) [code available]. Tags: Data-Free Knowledge Distillation, Diversity
Optimal Transport Guided Correlation Assignment for Multimodal Entity Linking (Jun 4, 2024) [code available]. Tags: Entity Linking, Knowledge Distillation
Teacher Agent: A Knowledge Distillation-Free Framework for Rehearsal-based Video Incremental Learning (Jun 1, 2023) [code available]. Tags: Incremental Learning, Knowledge Distillation
Facilitating Pornographic Text Detection for Open-Domain Dialogue Systems via Knowledge Distillation of Large Language Models (Mar 20, 2024) [code available]. Tags: Chatbot, Knowledge Distillation
Facilitating NSFW Text Detection in Open-Domain Dialogue Systems via Knowledge Distillation (Sep 18, 2023) [code available]. Tags: Chatbot, Knowledge Distillation
Optimizing edge AI models on HPC systems with the edge in the loop (May 26, 2025) [code available]. Tags: Hardware-Aware Neural Architecture Search, Knowledge Distillation
Teacher as a Lenient Expert: Teacher-Agnostic Data-Free Knowledge Distillation (Feb 18, 2024) [code available]. Tags: Data-Free Knowledge Distillation, Knowledge Distillation
Teacher Intervention: Improving Convergence of Quantization Aware Training for Ultra-Low Precision Transformers (Feb 23, 2023) [code available]. Tags: Knowledge Distillation, Quantization
Teacher Network Calibration Improves Cross-Quality Knowledge Distillation (Apr 15, 2023) [code available]. Tags: Image Classification
Facial Landmark Points Detection Using Knowledge Distillation-Based Neural Networks (Nov 13, 2021) [code available]. Tags: Face Alignment, Facial Landmark Detection
Boosting Residual Networks with Group Knowledge (Aug 26, 2023) [code available]. Tags: Knowledge Distillation
Exploiting the Semantic Knowledge of Pre-trained Text-Encoders for Continual Learning (Aug 2, 2024) [code available]. Tags: Continual Learning, Knowledge Distillation
Model-Based Reinforcement Learning with Multi-Task Offline Pretraining (Jun 6, 2023) [code available]. Tags: Knowledge Distillation, Model-Based Reinforcement Learning
When Babies Teach Babies: Can Student Knowledge Sharing Outperform Teacher-Guided Distillation on Small Datasets? (Nov 25, 2024) [code available]. Tags: Knowledge Distillation, Language Modeling
Eyelid's Intrinsic Motion-aware Feature Learning for Real-time Eyeblink Detection in the Wild (Aug 3, 2023) [code available]. Tags: Attribute Descriptive
An Investigation of the Combination of Rehearsal and Knowledge Distillation in Continual Learning for Spoken Language Understanding (Nov 15, 2022) [code available]. Tags: Class-Incremental Learning
ORC: Network Group-based Knowledge Distillation using Online Role Change (Jun 1, 2022) [code available]. Tags: Knowledge Distillation
Data-Free Generative Replay for Class-Incremental Learning on Imbalanced Data (Jun 7, 2024) [code available]. Tags: Class-Incremental Learning
Exploring Target Representations for Masked Autoencoders (Sep 8, 2022) [code available]. Tags: Image Classification, Instance Segmentation
Adaptive Mixing of Auxiliary Losses in Supervised Learning (Feb 7, 2022) [code available]. Tags: Denoising, Knowledge Distillation
Distilling the Unknown to Unveil Certainty (Nov 14, 2023) [code available]. Tags: Knowledge Distillation, Out-of-Distribution (OOD) Detection
Exploring Social Media for Early Detection of Depression in COVID-19 Patients (Feb 23, 2023) [code available]. Tags: Knowledge Distillation
Exploring Non-Autoregressive Text Style Transfer (Nov 1, 2021) [code available]. Tags: Contrastive Learning, Knowledge Distillation
Exploring Hyperspectral Anomaly Detection with Human Vision: A Small Target Aware Detector (Jan 2, 2024) [code available]. Tags: Anomaly Detection, Knowledge Distillation
Exploring Feature-based Knowledge Distillation for Recommender System: A Frequency Perspective (Nov 16, 2024) [code available]. Tags: Knowledge Distillation, Recommendation Systems
A Diversity-Enhanced Knowledge Distillation Model for Practical Math Word Problem Solving (Jan 7, 2025) [code available]. Tags: Diversity, Knowledge Distillation
Overcoming Uncertain Incompleteness for Robust Multimodal Sequential Diagnosis Prediction via Curriculum Data Erasing Guided Knowledge Distillation (Jul 28, 2024) [code available]. Tags: Knowledge Distillation, Sequential Diagnosis
Over-parameterized Student Model via Tensor Decomposition Boosted Knowledge Distillation (Nov 10, 2024) [code available]. Tags: Knowledge Distillation, Tensor Decomposition
Exploiting CLIP for Zero-shot HOI Detection Requires Knowledge Distillation at Multiple Levels (Sep 10, 2023) [code available]. Tags: Human-Object Interaction Detection, Knowledge Distillation
OVOSE: Open-Vocabulary Semantic Segmentation in Event-Based Cameras (Aug 18, 2024) [code available]. Tags: Autonomous Driving, Domain Adaptation
Evolutionary Generative Adversarial Networks with Crossover Based Knowledge Distillation (Jan 27, 2021) [code available]. Tags: Knowledge Distillation
PruMUX: Augmenting Data Multiplexing with Model Compression (May 24, 2023) [code available]. Tags: Knowledge Distillation, Model
Weak-to-Strong 3D Object Detection with X-Ray Distillation (Mar 31, 2024) [code available]. Tags: 3D Object Detection, Autonomous Driving