SOTAVerified

Self-Knowledge Distillation

Papers

Showing 11–20 of 68 papers

Title | Status | Hype
Preservation of the Global Knowledge by Not-True Distillation in Federated Learning | Code | 1
BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation | Code | 1
Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation | Code | 1
CrossMatch: Enhance Semi-Supervised Medical Image Segmentation with Perturbation Strategies and Knowledge Distillation | Code | 1
Regularizing Class-wise Predictions via Self-knowledge Distillation | Code | 1
Noisy Self-Knowledge Distillation for Text Summarization | Code | 1
FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning | Code | 1
ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks | Code | 1
Efficient Lung Ultrasound Severity Scoring Using Dedicated Feature Extractor | Code | 0
From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels | Code | 0
Page 2 of 7

No leaderboard results yet.