| Effective Whole-body Pose Estimation with Two-stages Distillation | Jul 29, 2023 | 2D Human Pose Estimation, Knowledge Distillation | Code Available | 4 | 5 |
| Towards A Generalizable Pathology Foundation Model via Unified Knowledge Distillation | Jul 26, 2024 | Knowledge Distillation, Question Answering | Code Available | 2 | 5 |
| Regularizing Class-wise Predictions via Self-knowledge Distillation | Mar 31, 2020 | Image Classification | Code Available | 1 | 5 |
| Robust Spatiotemporal Traffic Forecasting with Reinforced Dynamic Adversarial Training | Jun 25, 2023 | Adversarial Robustness, Knowledge Distillation | Code Available | 1 | 5 |
| Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation | Feb 25, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 | 5 |
| BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation | Feb 5, 2024 | Knowledge Distillation, Retrieval | Code Available | 1 | 5 |
| Self-Knowledge Distillation with Progressive Refinement of Targets | Jun 22, 2020 | Image Classification | Code Available | 1 | 5 |
| CrossMatch: Enhance Semi-Supervised Medical Image Segmentation with Perturbation Strategies and Knowledge Distillation | May 1, 2024 | Image Segmentation, Knowledge Distillation | Code Available | 1 | 5 |
| FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning | Aug 24, 2023 | Continual Learning, Federated Learning | Code Available | 1 | 5 |
| Three-Stream Temporal-Shift Attention Network Based on Self-Knowledge Distillation for Micro-Expression Recognition | Jun 25, 2024 | Knowledge Distillation, Micro-Expression Recognition | Code Available | 1 | 5 |
| Graph-based Knowledge Distillation: A survey and experimental evaluation | Feb 27, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 | 5 |
| DualFair: Fair Representation Learning at Both Group and Individual Levels via Contrastive Self-supervision | Mar 15, 2023 | Counterfactual, Fairness | Code Available | 1 | 5 |
| MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition | Aug 11, 2022 | Data Augmentation, Image Classification | Code Available | 1 | 5 |
| Multimodality Multi-Lead ECG Arrhythmia Classification using Self-Supervised Learning | Sep 30, 2022 | ECG Classification, Knowledge Distillation | Code Available | 1 | 5 |
| Noisy Self-Knowledge Distillation for Text Summarization | Sep 15, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 | 5 |
| Preservation of the Global Knowledge by Not-True Distillation in Federated Learning | Jun 6, 2021 | Continual Learning, Federated Learning | Code Available | 1 | 5 |
| ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks | May 7, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 | 5 |
| Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation | Mar 15, 2021 | Data Augmentation, Knowledge Distillation | Code Available | 1 | 5 |
| Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation | Oct 1, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 | 5 |
| Tackling Data Heterogeneity in Federated Learning through Knowledge Distillation with Inequitable Aggregation | Jun 25, 2025 | Federated Learning, Knowledge Distillation | Code Available | 0 | 5 |
| Distilled Gradual Pruning with Pruned Fine-tuning | Feb 15, 2024 | Image Classification, Knowledge Distillation | Code Available | 0 | 5 |
| Incorporating Graph Information in Transformer-based AMR Parsing | Jun 23, 2023 | Abstract Meaning Representation, AMR Parsing | Code Available | 0 | 5 |
| RAIN: RegulArization on Input and Network for Black-Box Domain Adaptation | Aug 22, 2022 | Data Augmentation, Domain Adaptation | Code Available | 0 | 5 |
| SalNAS: Efficient Saliency-prediction Neural Architecture Search with self-knowledge distillation | Jul 29, 2024 | Decoder, Knowledge Distillation | Code Available | 0 | 5 |
| Robust and Accurate Object Detection via Self-Knowledge Distillation | Nov 14, 2021 | Adversarial Robustness, Knowledge Distillation | Code Available | 0 | 5 |
| Combining inherent knowledge of vision-language models with unsupervised domain adaptation through strong-weak guidance | Dec 7, 2023 | Domain Adaptation, Knowledge Distillation | Code Available | 0 | 5 |
| Efficient Lung Ultrasound Severity Scoring Using Dedicated Feature Extractor | Jan 21, 2025 | Diagnostic, Knowledge Distillation | Code Available | 0 | 5 |
| Vision-Language Meets the Skeleton: Progressively Distillation with Cross-Modal Knowledge for 3D Action Representation Learning | May 31, 2024 | Action Recognition, Contrastive Learning | Code Available | 0 | 5 |
| Eyelid’s Intrinsic Motion-aware Feature Learning for Real-time Eyeblink Detection in the Wild | Aug 3, 2023 | Attribute, Descriptive | Code Available | 0 | 5 |
| Lightweight Self-Knowledge Distillation with Multi-source Information Fusion | May 16, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 | 5 |
| Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning | Sep 16, 2024 | Few-Shot Learning, Image Classification | Code Available | 0 | 5 |
| From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels | Mar 23, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 | 5 |
| Improving Sequential Recommendations via Bidirectional Temporal Data Augmentation with Pre-training | Dec 13, 2021 | Data Augmentation, Knowledge Distillation | Code Available | 0 | 5 |
| Deep Clustering with Diffused Sampling and Hardness-aware Self-distillation | Jan 25, 2024 | Clustering, Contrastive Learning | Code Available | 0 | 5 |
| Revisiting Knowledge Distillation via Label Smoothing Regularization | Sep 25, 2019 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 | 5 |
| Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation | Sep 29, 2023 | Cross-Lingual Question Answering, Cross-Lingual Transfer | Code Available | 0 | 5 |
| Guiding Frame-Level CTC Alignments Using Self-knowledge Distillation | Jun 12, 2024 | Automatic Speech Recognition (ASR) | Code Available | 0 | 5 |
| You Do Not Need Additional Priors or Regularizers in Retinex-Based Low-Light Image Enhancement | Jan 1, 2023 | Contrastive Learning, Image Enhancement | Unverified | 0 | 0 |
| A Novel Self-Knowledge Distillation Approach with Siamese Representation Learning for Action Recognition | Sep 3, 2022 | Action Recognition, Knowledge Distillation | Unverified | 0 | 0 |
| Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack | May 3, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 | 0 |
| Confidence Attention and Generalization Enhanced Distillation for Continuous Video Domain Adaptation | Mar 18, 2023 | Autonomous Driving, Domain Adaptation | Unverified | 0 | 0 |
| Double Reverse Regularization Network Based on Self-Knowledge Distillation for SAR Object Classification | Nov 26, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 | 0 |
| Extending Label Smoothing Regularization with Self-Knowledge Distillation | Sep 11, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 | 0 |
| Extracting knowledge from features with multilevel abstraction | Dec 4, 2021 | Data Augmentation, Knowledge Distillation | Unverified | 0 | 0 |
| Generative Dataset Distillation Based on Self-knowledge Distillation | Jan 8, 2025 | Dataset Distillation, Knowledge Distillation | Unverified | 0 | 0 |
| Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding | Jun 7, 2022 | Graph Embedding, Knowledge Distillation | Unverified | 0 | 0 |
| Investigating and Enhancing Vision-Audio Capability in Omnimodal Large Language Models | Feb 27, 2025 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 | 0 |
| Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition | Feb 4, 2022 | Classification, Knowledge Distillation | Unverified | 0 | 0 |
| Look Backward and Forward: Self-Knowledge Distillation with Bidirectional Decoder for Neural Machine Translation | Mar 10, 2022 | Decoder, Knowledge Distillation | Unverified | 0 | 0 |
| MoLe-VLA: Dynamic Layer-skipping Vision Language Action Model via Mixture-of-Layers for Efficient Robot Manipulation | Mar 26, 2025 | Knowledge Distillation, Mixture-of-Experts | Unverified | 0 | 0 |
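Most entries above share one core idea: a network is trained against soft targets derived from its own predictions rather than from a separate teacher. The following is a minimal NumPy sketch of that idea in the progressive style (blending the ground-truth label with the model's earlier prediction, as in "Self-Knowledge Distillation with Progressive Refinement of Targets"); the function names and blending scheme are illustrative, not code from any listed paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def pskd_soft_target(one_hot, prev_prob, alpha):
    """Self-distillation target: blend the ground-truth one-hot label with
    the model's own prediction from an earlier epoch. alpha typically grows
    from 0 to ~0.8 over training, shifting trust toward the model itself."""
    return (1.0 - alpha) * one_hot + alpha * prev_prob

def soft_cross_entropy(logits, target_prob):
    # Cross-entropy against a soft (non-one-hot) target distribution.
    log_p = np.log(softmax(logits) + 1e-12)
    return -(target_prob * log_p).sum(axis=-1).mean()

# Toy example: one 3-class sample, halfway through training (alpha = 0.5).
one_hot = np.array([[0.0, 1.0, 0.0]])
prev_prob = np.array([[0.2, 0.6, 0.2]])      # model's prediction last epoch
target = pskd_soft_target(one_hot, prev_prob, alpha=0.5)
loss = soft_cross_entropy(np.array([[1.0, 2.0, 0.5]]), target)
```

With `alpha = 0` this reduces to ordinary cross-entropy training; the listed methods differ mainly in where the soft target comes from (an earlier epoch, an auxiliary branch, a mixup pair, or peer samples).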