| Double Reverse Regularization Network Based on Self-Knowledge Distillation for SAR Object Classification | Nov 26, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 | 0 |
| Extending Label Smoothing Regularization with Self-Knowledge Distillation | Sep 11, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 | 0 |
| Extracting Knowledge from Features with Multilevel Abstraction | Dec 4, 2021 | Data Augmentation, Knowledge Distillation | Unverified | 0 | 0 |
| From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels | Mar 23, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 | 0 |
| Generative Dataset Distillation Based on Self-Knowledge Distillation | Jan 8, 2025 | Dataset Distillation, Knowledge Distillation | Unverified | 0 | 0 |
| Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding | Jun 7, 2022 | Graph Embedding, Knowledge Distillation | Unverified | 0 | 0 |
| Investigating and Enhancing Vision-Audio Capability in Omnimodal Large Language Models | Feb 27, 2025 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 | 0 |
| Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition | Feb 4, 2022 | Classification, Knowledge Distillation | Unverified | 0 | 0 |
| Look Backward and Forward: Self-Knowledge Distillation with Bidirectional Decoder for Neural Machine Translation | Mar 10, 2022 | Decoder, Knowledge Distillation | Unverified | 0 | 0 |
| MoLe-VLA: Dynamic Layer-skipping Vision Language Action Model via Mixture-of-Layers for Efficient Robot Manipulation | Mar 26, 2025 | Knowledge Distillation, Mixture-of-Experts | Unverified | 0 | 0 |