
Self-Knowledge Distillation
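For reference, self-knowledge distillation regularizes a network with its own predictions (from earlier epochs, auxiliary branches, or augmented views) rather than with a separate pre-trained teacher. The sketch below illustrates one common variant, blending the ground-truth label with the model's past softmax outputs to form soft targets, in the spirit of the progressive-target-refinement papers listed here; the function name and the `alpha`/`temperature` parameters are illustrative assumptions, not any listed paper's exact implementation.

```python
import torch
import torch.nn.functional as F

def self_kd_loss(logits: torch.Tensor,
                 targets: torch.Tensor,
                 past_probs: torch.Tensor,
                 alpha: float = 0.6,
                 temperature: float = 1.0) -> torch.Tensor:
    """Generic self-knowledge distillation loss (a sketch, not one paper's method).

    logits:     (B, C) current model outputs
    targets:    (B,)   integer class labels
    past_probs: (B, C) the model's own softmax outputs saved from an earlier epoch
    """
    num_classes = logits.size(1)
    one_hot = F.one_hot(targets, num_classes).float()
    # Soft target: blend the ground truth with the model's earlier predictions.
    soft_target = (1.0 - alpha) * one_hot + alpha * past_probs
    log_probs = F.log_softmax(logits / temperature, dim=1)
    # Cross-entropy against the blended soft target.
    return -(soft_target * log_probs).sum(dim=1).mean()

# Usage: cache softmax outputs each epoch, then distill from them the next epoch.
# logits = model(x)
# loss = self_kd_loss(logits, y, past_probs=cached_probs[batch_indices])
```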

Papers

Showing 51–68 of 68 papers

Title | Status | Hype
Improving Sequential Recommendations via Bidirectional Temporal Data Augmentation with Pre-training | Code | 0
Extracting knowledge from features with multilevel abstraction | – | 0
Robust and Accurate Object Detection via Self-Knowledge Distillation | Code | 0
Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation | Code | 0
Preservation of the Global Knowledge by Not-True Distillation in Federated Learning | Code | 1
Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack | – | 0
Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation | – | 0
Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation | Code | 1
Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation | Code | 1
Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks | – | 0
Noisy Self-Knowledge Distillation for Text Summarization | Code | 1
Extending Label Smoothing Regularization with Self-Knowledge Distillation | – | 0
Self-Knowledge Distillation with Progressive Refinement of Targets | Code | 1
ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks | Code | 1
Regularizing Class-wise Predictions via Self-knowledge Distillation | Code | 1
Self-Knowledge Distillation Adversarial Attack | – | 0
Revisiting Knowledge Distillation via Label Smoothing Regularization | Code | 0
Self-Knowledge Distillation in Natural Language Processing | – | 0

Leaderboard

No leaderboard results yet.