| Title | Date | Tags | Code |
| --- | --- | --- | --- |
| Lightweight Self-Knowledge Distillation with Multi-source Information Fusion | May 16, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Code Available |
| Incorporating Graph Information in Transformer-based AMR Parsing | Jun 23, 2023 | Abstract Meaning Representation, AMR Parsing | Code Available |
| Guiding Frame-Level CTC Alignments Using Self-knowledge Distillation | Jun 12, 2024 | Automatic Speech Recognition, Automatic Speech Recognition (ASR) | Code Available |
| Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation | Sep 29, 2023 | Cross-Lingual Question Answering, Cross-Lingual Transfer | Code Available |
| From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels | Mar 23, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Code Available |
| Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning | Sep 16, 2024 | Few-Shot Learning, image-classification | Code Available |
| Eyelid’s Intrinsic Motion-aware Feature Learning for Real-time Eyeblink Detection in the Wild | Aug 3, 2023 | Attribute, Descriptive | Code Available |
| Revisiting Knowledge Distillation via Label Smoothing Regularization | Sep 25, 2019 | Knowledge Distillation, Self-Knowledge Distillation | Code Available |
| Robust and Accurate Object Detection via Self-Knowledge Distillation | Nov 14, 2021 | Adversarial Robustness, Knowledge Distillation | Code Available |
| Efficient Lung Ultrasound Severity Scoring Using Dedicated Feature Extractor | Jan 21, 2025 | Diagnostic, Knowledge Distillation | Code Available |
| SalNAS: Efficient Saliency-prediction Neural Architecture Search with self-knowledge distillation | Jul 29, 2024 | Decoder, Knowledge Distillation | Code Available |
| Improving Sequential Recommendations via Bidirectional Temporal Data Augmentation with Pre-training | Dec 13, 2021 | Data Augmentation, Knowledge Distillation | Code Available |
| Combining inherent knowledge of vision-language models with unsupervised domain adaptation through strong-weak guidance | Dec 7, 2023 | Domain Adaptation, Knowledge Distillation | Code Available |
| Distilled Gradual Pruning with Pruned Fine-tuning | Feb 15, 2024 | Image Classification, Knowledge Distillation | Code Available |
| Deep Clustering with Diffused Sampling and Hardness-aware Self-distillation | Jan 25, 2024 | Clustering, Contrastive Learning | Code Available |
| Vision-Language Meets the Skeleton: Progressively Distillation with Cross-Modal Knowledge for 3D Action Representation Learning | May 31, 2024 | Action Recognition, Contrastive Learning | Code Available |
| RAIN: RegulArization on Input and Network for Black-Box Domain Adaptation | Aug 22, 2022 | Data Augmentation, Domain Adaptation | Code Available |
| Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation | Oct 1, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Code Available |