| Title | Date | Topics | Code |
| --- | --- | --- | --- |
| Preservation of the Global Knowledge by Not-True Distillation in Federated Learning | Jun 6, 2021 | Continual Learning, Federated Learning | Code Available |
| BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation | Feb 5, 2024 | Knowledge Distillation, Retrieval | Code Available |
| Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation | Feb 25, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Code Available |
| CrossMatch: Enhance Semi-Supervised Medical Image Segmentation with Perturbation Strategies and Knowledge Distillation | May 1, 2024 | Image Segmentation, Knowledge Distillation | Code Available |
| Regularizing Class-wise Predictions via Self-knowledge Distillation | Mar 31, 2020 | Image Classification | Code Available |
| Noisy Self-Knowledge Distillation for Text Summarization | Sep 15, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Code Available |
| FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning | Aug 24, 2023 | Continual Learning, Federated Learning | Code Available |
| ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks | May 7, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Code Available |
| Efficient Lung Ultrasound Severity Scoring Using Dedicated Feature Extractor | Jan 21, 2025 | Diagnostic, Knowledge Distillation | Code Available |
| From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels | Mar 23, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Code Available |