SOTAVerified

Self-Knowledge Distillation

Papers

Showing 51–60 of 68 papers

| Title | Status | Hype |
|---|---|---|
| Improving Sequential Recommendations via Bidirectional Temporal Data Augmentation with Pre-training | Code | 0 |
| Extracting knowledge from features with multilevel abstraction | | 0 |
| Robust and Accurate Object Detection via Self-Knowledge Distillation | Code | 0 |
| Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation | Code | 0 |
| Preservation of the Global Knowledge by Not-True Distillation in Federated Learning | Code | 1 |
| Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack | | 0 |
| Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation | | 0 |
| Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation | Code | 1 |
| Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation | Code | 1 |
| Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks | | 0 |
Page 6 of 7

No leaderboard results yet.