| Title | Date | Tags | Code | # |
| --- | --- | --- | --- | --- |
| Effective Whole-body Pose Estimation with Two-stages Distillation | Jul 29, 2023 | 2D Human Pose Estimation, Knowledge Distillation | Code Available | 4 |
| Towards A Generalizable Pathology Foundation Model via Unified Knowledge Distillation | Jul 26, 2024 | Knowledge Distillation, Question Answering | Code Available | 2 |
| Three-Stream Temporal-Shift Attention Network Based on Self-Knowledge Distillation for Micro-Expression Recognition | Jun 25, 2024 | Knowledge Distillation, Micro Expression Recognition | Code Available | 1 |
| CrossMatch: Enhance Semi-Supervised Medical Image Segmentation with Perturbation Strategies and Knowledge Distillation | May 1, 2024 | Image Segmentation, Knowledge Distillation | Code Available | 1 |
| BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation | Feb 5, 2024 | Knowledge Distillation, Retrieval | Code Available | 1 |
| FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning | Aug 24, 2023 | Continual Learning, Federated Learning | Code Available | 1 |
| Robust Spatiotemporal Traffic Forecasting with Reinforced Dynamic Adversarial Training | Jun 25, 2023 | Adversarial Robustness, Knowledge Distillation | Code Available | 1 |
| DualFair: Fair Representation Learning at Both Group and Individual Levels via Contrastive Self-supervision | Mar 15, 2023 | Counterfactual, Fairness | Code Available | 1 |
| Graph-based Knowledge Distillation: A survey and experimental evaluation | Feb 27, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 |
| Multimodality Multi-Lead ECG Arrhythmia Classification using Self-Supervised Learning | Sep 30, 2022 | ECG Classification, Knowledge Distillation | Code Available | 1 |
| MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition | Aug 11, 2022 | Data Augmentation, Image Classification | Code Available | 1 |
| Preservation of the Global Knowledge by Not-True Distillation in Federated Learning | Jun 6, 2021 | Continual Learning, Federated Learning | Code Available | 1 |
| Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation | Mar 15, 2021 | Data Augmentation, Knowledge Distillation | Code Available | 1 |
| Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation | Feb 25, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 |
| Noisy Self-Knowledge Distillation for Text Summarization | Sep 15, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 |
| Self-Knowledge Distillation with Progressive Refinement of Targets | Jun 22, 2020 | Image Classification | Code Available | 1 |
| ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks | May 7, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 |
| Regularizing Class-wise Predictions via Self-knowledge Distillation | Mar 31, 2020 | Image Classification | Code Available | 1 |
| Tackling Data Heterogeneity in Federated Learning through Knowledge Distillation with Inequitable Aggregation | Jun 25, 2025 | Federated Learning, Knowledge Distillation | Code Available | 0 |
| MoLe-VLA: Dynamic Layer-skipping Vision Language Action Model via Mixture-of-Layers for Efficient Robot Manipulation | Mar 26, 2025 | Knowledge Distillation, Mixture-of-Experts | Unverified | 0 |
| xVLM2Vec: Adapting LVLM-based embedding models to multilinguality using Self-Knowledge Distillation | Mar 12, 2025 | Knowledge Distillation, Language Modeling | Unverified | 0 |
| Investigating and Enhancing Vision-Audio Capability in Omnimodal Large Language Models | Feb 27, 2025 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Efficient Lung Ultrasound Severity Scoring Using Dedicated Feature Extractor | Jan 21, 2025 | Diagnostic, Knowledge Distillation | Code Available | 0 |
| Generative Dataset Distillation Based on Self-knowledge Distillation | Jan 8, 2025 | Dataset Distillation, Knowledge Distillation | Unverified | 0 |
| Towards Satellite Non-IID Imagery: A Spectral Clustering-Assisted Federated Learning Approach | Oct 17, 2024 | Earth Observation, Federated Learning | Unverified | 0 |
| Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning | Sep 16, 2024 | Few-Shot Learning, Image Classification | Code Available | 0 |
| SalNAS: Efficient Saliency-prediction Neural Architecture Search with self-knowledge distillation | Jul 29, 2024 | Decoder, Knowledge Distillation | Code Available | 0 |
| SeCoKD: Aligning Large Language Models for In-Context Learning with Fewer Shots | Jun 20, 2024 | In-Context Learning, Knowledge Distillation | Unverified | 0 |
| Self-Knowledge Distillation for Learning Ambiguity | Jun 14, 2024 | Knowledge Distillation, Natural Language Understanding | Unverified | 0 |
| Guiding Frame-Level CTC Alignments Using Self-knowledge Distillation | Jun 12, 2024 | Automatic Speech Recognition (ASR) | Code Available | 0 |
| Vision-Language Meets the Skeleton: Progressively Distillation with Cross-Modal Knowledge for 3D Action Representation Learning | May 31, 2024 | Action Recognition, Contrastive Learning | Code Available | 0 |
| Weakly Supervised Monocular 3D Detection with a Single-View Image | Feb 29, 2024 | Knowledge Distillation, Object Localization | Unverified | 0 |
| Distilled Gradual Pruning with Pruned Fine-tuning | Feb 15, 2024 | Image Classification, Knowledge Distillation | Code Available | 0 |
| Deep Clustering with Diffused Sampling and Hardness-aware Self-distillation | Jan 25, 2024 | Clustering, Contrastive Learning | Code Available | 0 |
| X Modality Assisting RGBT Object Tracking | Dec 27, 2023 | Knowledge Distillation, Object | Unverified | 0 |
| Combining inherent knowledge of vision-language models with unsupervised domain adaptation through strong-weak guidance | Dec 7, 2023 | Domain Adaptation, Knowledge Distillation | Code Available | 0 |
| Double Reverse Regularization Network Based on Self-Knowledge Distillation for SAR Object Classification | Nov 26, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation | Sep 29, 2023 | Cross-Lingual Question Answering, Cross-Lingual Transfer | Code Available | 0 |
| Eyelid's Intrinsic Motion-aware Feature Learning for Real-time Eyeblink Detection in the Wild | Aug 3, 2023 | Attribute, Descriptive | Code Available | 0 |
| Three Factors to Improve Out-of-Distribution Detection | Aug 2, 2023 | Contrastive Learning, Knowledge Distillation | Unverified | 0 |
| Incorporating Graph Information in Transformer-based AMR Parsing | Jun 23, 2023 | Abstract Meaning Representation, AMR Parsing | Code Available | 0 |
| Self-Knowledge Distillation for Surgical Phase Recognition | Jun 15, 2023 | Decoder, Knowledge Distillation | Unverified | 0 |
| Lightweight Self-Knowledge Distillation with Multi-source Information Fusion | May 16, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 |
| From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels | Mar 23, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 |
| Confidence Attention and Generalization Enhanced Distillation for Continuous Video Domain Adaptation | Mar 18, 2023 | Autonomous Driving, Domain Adaptation | Unverified | 0 |
| You Do Not Need Additional Priors or Regularizers in Retinex-Based Low-Light Image Enhancement | Jan 1, 2023 | Contrastive Learning, Image Enhancement | Unverified | 0 |
| Siamese Sleep Transformer For Robust Sleep Stage Scoring With Self-knowledge Distillation and Selective Batch Sampling | Dec 12, 2022 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| AI-KD: Adversarial learning and Implicit regularization for self-Knowledge Distillation | Nov 20, 2022 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| TASKED: Transformer-based Adversarial learning for human activity recognition using wearable sensors via Self-KnowledgE Distillation | Sep 14, 2022 | Human Activity Recognition | Unverified | 0 |
| A Novel Self-Knowledge Distillation Approach with Siamese Representation Learning for Action Recognition | Sep 3, 2022 | Action Recognition, Knowledge Distillation | Unverified | 0 |