SOTAVerified

Self-Knowledge Distillation

Papers

Showing 1–10 of 68 papers

| Title | Status | Hype |
| --- | --- | --- |
| Effective Whole-body Pose Estimation with Two-stages Distillation | Code | 4 |
| Towards A Generalizable Pathology Foundation Model via Unified Knowledge Distillation | Code | 2 |
| Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation | Code | 1 |
| MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition | Code | 1 |
| BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation | Code | 1 |
| CrossMatch: Enhance Semi-Supervised Medical Image Segmentation with Perturbation Strategies and Knowledge Distillation | Code | 1 |
| FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning | Code | 1 |
| DualFair: Fair Representation Learning at Both Group and Individual Levels via Contrastive Self-supervision | Code | 1 |
| Graph-based Knowledge Distillation: A survey and experimental evaluation | Code | 1 |
| Multimodality Multi-Lead ECG Arrhythmia Classification using Self-Supervised Learning | Code | 1 |
Page 1 of 7

No leaderboard results yet.