| Title | Date | Tags | Code | Count |
|---|---|---|---|---|
| Effective Whole-body Pose Estimation with Two-stages Distillation | Jul 29, 2023 | 2D Human Pose Estimation, Knowledge Distillation | Code Available | 4 |
| Towards A Generalizable Pathology Foundation Model via Unified Knowledge Distillation | Jul 26, 2024 | Knowledge Distillation, Question Answering | Code Available | 2 |
| Three-Stream Temporal-Shift Attention Network Based on Self-Knowledge Distillation for Micro-Expression Recognition | Jun 25, 2024 | Knowledge Distillation, Micro-Expression Recognition | Code Available | 1 |
| Graph-based Knowledge Distillation: A survey and experimental evaluation | Feb 27, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 |
| Preservation of the Global Knowledge by Not-True Distillation in Federated Learning | Jun 6, 2021 | Continual Learning, Federated Learning | Code Available | 1 |
| Self-Knowledge Distillation with Progressive Refinement of Targets | Jun 22, 2020 | Image Classification | Code Available | 1 |
| Noisy Self-Knowledge Distillation for Text Summarization | Sep 15, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 |
| DualFair: Fair Representation Learning at Both Group and Individual Levels via Contrastive Self-supervision | Mar 15, 2023 | Counterfactual, Fairness | Code Available | 1 |
| Multimodality Multi-Lead ECG Arrhythmia Classification using Self-Supervised Learning | Sep 30, 2022 | ECG Classification, Knowledge Distillation | Code Available | 1 |
| MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition | Aug 11, 2022 | Data Augmentation, Image Classification | Code Available | 1 |
| Robust Spatiotemporal Traffic Forecasting with Reinforced Dynamic Adversarial Training | Jun 25, 2023 | Adversarial Robustness, Knowledge Distillation | Code Available | 1 |
| Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation | Feb 25, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 |
| BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation | Feb 5, 2024 | Knowledge Distillation, Retrieval | Code Available | 1 |
| CrossMatch: Enhance Semi-Supervised Medical Image Segmentation with Perturbation Strategies and Knowledge Distillation | May 1, 2024 | Image Segmentation, Knowledge Distillation | Code Available | 1 |
| Regularizing Class-wise Predictions via Self-knowledge Distillation | Mar 31, 2020 | Image Classification | Code Available | 1 |
| Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation | Mar 15, 2021 | Data Augmentation, Knowledge Distillation | Code Available | 1 |
| FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning | Aug 24, 2023 | Continual Learning, Federated Learning | Code Available | 1 |
| ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks | May 7, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 |
| Siamese Sleep Transformer For Robust Sleep Stage Scoring With Self-knowledge Distillation and Selective Batch Sampling | Dec 12, 2022 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| A Novel Self-Knowledge Distillation Approach with Siamese Representation Learning for Action Recognition | Sep 3, 2022 | Action Recognition, Knowledge Distillation | Unverified | 0 |
| Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack | May 3, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Confidence Attention and Generalization Enhanced Distillation for Continuous Video Domain Adaptation | Mar 18, 2023 | Autonomous Driving, Domain Adaptation | Unverified | 0 |
| Double Reverse Regularization Network Based on Self-Knowledge Distillation for SAR Object Classification | Nov 26, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Extending Label Smoothing Regularization with Self-Knowledge Distillation | Sep 11, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Extracting knowledge from features with multilevel abstraction | Dec 4, 2021 | Data Augmentation, Knowledge Distillation | Unverified | 0 |
| Generative Dataset Distillation Based on Self-knowledge Distillation | Jan 8, 2025 | Dataset Distillation, Knowledge Distillation | Unverified | 0 |
| Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding | Jun 7, 2022 | Graph Embedding, Knowledge Distillation | Unverified | 0 |
| Investigating and Enhancing Vision-Audio Capability in Omnimodal Large Language Models | Feb 27, 2025 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition | Feb 4, 2022 | Classification, Knowledge Distillation | Unverified | 0 |
| Look Backward and Forward: Self-Knowledge Distillation with Bidirectional Decoder for Neural Machine Translation | Mar 10, 2022 | Decoder, Knowledge Distillation | Unverified | 0 |
| MoLe-VLA: Dynamic Layer-skipping Vision Language Action Model via Mixture-of-Layers for Efficient Robot Manipulation | Mar 26, 2025 | Knowledge Distillation, Mixture-of-Experts | Unverified | 0 |
| SeCoKD: Aligning Large Language Models for In-Context Learning with Fewer Shots | Jun 20, 2024 | In-Context Learning, Knowledge Distillation | Unverified | 0 |
| Self-Knowledge Distillation Adversarial Attack | Sep 25, 2019 | Adversarial Attack, Knowledge Distillation | Unverified | 0 |
| Self-Knowledge Distillation based Self-Supervised Learning for Covid-19 Detection from Chest X-Ray Images | Jun 7, 2022 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Self-Knowledge Distillation for Learning Ambiguity | Jun 14, 2024 | Knowledge Distillation, Natural Language Understanding | Unverified | 0 |
| Self-Knowledge Distillation for Surgical Phase Recognition | Jun 15, 2023 | Decoder, Knowledge Distillation | Unverified | 0 |
| Self-Knowledge Distillation in Natural Language Processing | Aug 2, 2019 | Deep Learning, Knowledge Distillation | Unverified | 0 |
| Self-Knowledge Distillation via Dropout | Aug 11, 2022 | Adversarial Robustness, Image Classification | Unverified | 0 |
| AI-KD: Adversarial learning and Implicit regularization for self-Knowledge Distillation | Nov 20, 2022 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Spatial Likelihood Voting with Self-Knowledge Distillation for Weakly Supervised Object Detection | Apr 14, 2022 | Knowledge Distillation, Multiple Instance Learning | Unverified | 0 |
| Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks | Sep 30, 2020 | Image Classification | Unverified | 0 |
| TASKED: Transformer-based Adversarial learning for human activity recognition using wearable sensors via Self-KnowledgE Distillation | Sep 14, 2022 | Human Activity Recognition | Unverified | 0 |
| Three Factors to Improve Out-of-Distribution Detection | Aug 2, 2023 | Contrastive Learning, Knowledge Distillation | Unverified | 0 |
| Towards Satellite Non-IID Imagery: A Spectral Clustering-Assisted Federated Learning Approach | Oct 17, 2024 | Earth Observation, Federated Learning | Unverified | 0 |
| Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation | Mar 17, 2021 | Automatic Speech Recognition (ASR) | Unverified | 0 |
| Weakly Supervised Monocular 3D Detection with a Single-View Image | Feb 29, 2024 | Knowledge Distillation, Object Localization | Unverified | 0 |
| X Modality Assisting RGBT Object Tracking | Dec 27, 2023 | Knowledge Distillation, Object | Unverified | 0 |
| xVLM2Vec: Adapting LVLM-based embedding models to multilinguality using Self-Knowledge Distillation | Mar 12, 2025 | Knowledge Distillation, Language Modeling | Unverified | 0 |
| You Do Not Need Additional Priors or Regularizers in Retinex-Based Low-Light Image Enhancement | Jan 1, 2023 | Contrastive Learning, Image Enhancement | Unverified | 0 |
| Tackling Data Heterogeneity in Federated Learning through Knowledge Distillation with Inequitable Aggregation | Jun 25, 2025 | Federated Learning, Knowledge Distillation | Code Available | 0 |
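Most of the papers listed above share one core recipe: instead of distilling from a separate, larger teacher, the model is regularized toward temperature-softened soft targets produced by itself (an earlier snapshot, another view, or another branch). A minimal sketch of that generic loss is below; the function names, the `alpha`/`T` weighting, and the "previous-snapshot teacher" choice are illustrative assumptions, not the formulation of any single paper in the table.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # stabilize exp
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_distillation_loss(logits, prev_logits, labels, T=2.0, alpha=0.5):
    """Cross-entropy on hard labels, plus KL divergence pulling the current
    predictions toward the model's own softened past predictions (the
    'teacher' here is simply an earlier snapshot of the same network)."""
    n = len(labels)
    p = softmax(logits)
    ce = -np.log(p[np.arange(n), labels] + 1e-12).mean()
    q_teacher = softmax(prev_logits, T)  # soft targets from the past self
    q_student = softmax(logits, T)
    kl = (q_teacher * (np.log(q_teacher + 1e-12)
                       - np.log(q_student + 1e-12))).sum(axis=-1).mean()
    # T**2 rescales gradients, as is conventional in distillation losses
    return (1 - alpha) * ce + alpha * (T ** 2) * kl
```

When the current and past predictions agree, the KL term vanishes and the loss reduces to plain (down-weighted) cross-entropy; the papers above differ mainly in where the soft targets come from and how the two terms are scheduled.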