| Combining inherent knowledge of vision-language models with unsupervised domain adaptation through strong-weak guidance | Dec 7, 2023 | Domain Adaptation, Knowledge Distillation | Code Available | 0 | 5 |
| Efficient Lung Ultrasound Severity Scoring Using Dedicated Feature Extractor | Jan 21, 2025 | Diagnostic, Knowledge Distillation | Code Available | 0 | 5 |
| Vision-Language Meets the Skeleton: Progressively Distillation with Cross-Modal Knowledge for 3D Action Representation Learning | May 31, 2024 | Action Recognition, Contrastive Learning | Code Available | 0 | 5 |
| Eyelid's Intrinsic Motion-aware Feature Learning for Real-time Eyeblink Detection in the Wild | Aug 3, 2023 | Attribute, Descriptive | Code Available | 0 | 5 |
| Lightweight Self-Knowledge Distillation with Multi-source Information Fusion | May 16, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 | 5 |
| Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning | Sep 16, 2024 | Few-Shot Learning, image-classification | Code Available | 0 | 5 |
| From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels | Mar 23, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 | 5 |
| Improving Sequential Recommendations via Bidirectional Temporal Data Augmentation with Pre-training | Dec 13, 2021 | Data Augmentation, Knowledge Distillation | Code Available | 0 | 5 |
| Deep Clustering with Diffused Sampling and Hardness-aware Self-distillation | Jan 25, 2024 | Clustering, Contrastive Learning | Code Available | 0 | 5 |
| Revisiting Knowledge Distillation via Label Smoothing Regularization | Sep 25, 2019 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 | 5 |
| Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation | Sep 29, 2023 | Cross-Lingual Question Answering, Cross-Lingual Transfer | Code Available | 0 | 5 |
| Guiding Frame-Level CTC Alignments Using Self-knowledge Distillation | Jun 12, 2024 | Automatic Speech Recognition, Automatic Speech Recognition (ASR) | Code Available | 0 | 5 |
| You Do Not Need Additional Priors or Regularizers in Retinex-Based Low-Light Image Enhancement | Jan 1, 2023 | Contrastive Learning, Image Enhancement | Unverified | 0 | 0 |
| A Novel Self-Knowledge Distillation Approach with Siamese Representation Learning for Action Recognition | Sep 3, 2022 | Action Recognition, Knowledge Distillation | Unverified | 0 | 0 |
| Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack | May 3, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 | 0 |
| Confidence Attention and Generalization Enhanced Distillation for Continuous Video Domain Adaptation | Mar 18, 2023 | Autonomous Driving, Domain Adaptation | Unverified | 0 | 0 |
| Double Reverse Regularization Network Based on Self-Knowledge Distillation for SAR Object Classification | Nov 26, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 | 0 |
| Extending Label Smoothing Regularization with Self-Knowledge Distillation | Sep 11, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 | 0 |
| Extracting knowledge from features with multilevel abstraction | Dec 4, 2021 | Data Augmentation, Knowledge Distillation | Unverified | 0 | 0 |
| Generative Dataset Distillation Based on Self-knowledge Distillation | Jan 8, 2025 | Dataset Distillation, Knowledge Distillation | Unverified | 0 | 0 |
| Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding | Jun 7, 2022 | Graph Embedding, Knowledge Distillation | Unverified | 0 | 0 |
| Investigating and Enhancing Vision-Audio Capability in Omnimodal Large Language Models | Feb 27, 2025 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 | 0 |
| Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition | Feb 4, 2022 | Classification, Knowledge Distillation | Unverified | 0 | 0 |
| Look Backward and Forward: Self-Knowledge Distillation with Bidirectional Decoder for Neural Machine Translation | Mar 10, 2022 | Decoder, Knowledge Distillation | Unverified | 0 | 0 |
| MoLe-VLA: Dynamic Layer-skipping Vision Language Action Model via Mixture-of-Layers for Efficient Robot Manipulation | Mar 26, 2025 | Knowledge Distillation, Mixture-of-Experts | Unverified | 0 | 0 |