SOTAVerified

Self-Knowledge Distillation

Papers

Showing 11–20 of 68 papers

Title | Status | Hype
ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks | Code | 1
BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation | Code | 1
Self-Knowledge Distillation with Progressive Refinement of Targets | Code | 1
CrossMatch: Enhance Semi-Supervised Medical Image Segmentation with Perturbation Strategies and Knowledge Distillation | Code | 1
MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition | Code | 1
Multimodality Multi-Lead ECG Arrhythmia Classification using Self-Supervised Learning | Code | 1
FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning | Code | 1
Graph-based Knowledge Distillation: A survey and experimental evaluation | Code | 1
Extending Label Smoothing Regularization with Self-Knowledge Distillation | — | 0
Confidence Attention and Generalization Enhanced Distillation for Continuous Video Domain Adaptation | — | 0
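One of the listed papers, "Self-Knowledge Distillation with Progressive Refinement of Targets" (PS-KD), trains a network against targets that mix the ground-truth labels with the model's own predictions from the previous epoch, with the mixing weight grown over training. A minimal NumPy sketch of that target-mixing idea follows; the function names and toy numbers are illustrative assumptions, not taken from the paper's released code:

```python
import numpy as np

def progressive_soft_targets(labels, past_probs, alpha, num_classes):
    """Mix hard labels with the model's own past predictions (PS-KD-style).

    labels:     (N,) integer class labels
    past_probs: (N, C) softmax outputs saved from the previous epoch
    alpha:      mixing weight, typically grown from 0 toward ~0.8 over training
    """
    one_hot = np.eye(num_classes)[labels]
    return (1.0 - alpha) * one_hot + alpha * past_probs

def soft_cross_entropy(probs, targets, eps=1e-12):
    """Cross-entropy against soft targets, averaged over the batch."""
    return float(-(targets * np.log(probs + eps)).sum(axis=1).mean())

# Toy example: 2 samples, 3 classes.
labels = np.array([0, 2])
past = np.array([[0.7, 0.2, 0.1],
                 [0.2, 0.3, 0.5]])
targets = progressive_soft_targets(labels, past, alpha=0.5, num_classes=3)
# Each row of `targets` remains a valid probability distribution,
# so the usual softmax cross-entropy loss applies unchanged.
```

Because the "teacher" is just a cached copy of the student's earlier outputs, this form of self-distillation needs no second network, only storage for one extra (N, C) prediction array per epoch.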
Page 2 of 7

No leaderboard results yet.