
Self-Knowledge Distillation

Papers

Showing 11–20 of 68 papers (page 2 of 7)

| Title | Status | Hype |
| --- | --- | --- |
| MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition | Code | 1 |
| Preservation of the Global Knowledge by Not-True Distillation in Federated Learning | Code | 1 |
| Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation | Code | 1 |
| Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation | Code | 1 |
| Noisy Self-Knowledge Distillation for Text Summarization | Code | 1 |
| Self-Knowledge Distillation with Progressive Refinement of Targets | Code | 1 |
| ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks | Code | 1 |
| Regularizing Class-wise Predictions via Self-knowledge Distillation | Code | 1 |
| Tackling Data Heterogeneity in Federated Learning through Knowledge Distillation with Inequitable Aggregation | Code | 0 |
| MoLe-VLA: Dynamic Layer-skipping Vision Language Action Model via Mixture-of-Layers for Efficient Robot Manipulation |  | 0 |
