SOTAVerified

Self-Knowledge Distillation

Papers

Showing 51–68 of 68 papers

Title | Status | Hype
RAIN: RegulArization on Input and Network for Black-Box Domain Adaptation | Code | 0
Self-Knowledge Distillation via Dropout | - | 0
Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding | - | 0
Self-Knowledge Distillation based Self-Supervised Learning for Covid-19 Detection from Chest X-Ray Images | - | 0
Spatial Likelihood Voting with Self-Knowledge Distillation for Weakly Supervised Object Detection | - | 0
Look Backward and Forward: Self-Knowledge Distillation with Bidirectional Decoder for Neural Machine Translation | - | 0
Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition | - | 0
Improving Sequential Recommendations via Bidirectional Temporal Data Augmentation with Pre-training | Code | 0
Extracting knowledge from features with multilevel abstraction | - | 0
Robust and Accurate Object Detection via Self-Knowledge Distillation | Code | 0
Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation | Code | 0
Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack | - | 0
Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation | - | 0
Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks | - | 0
Extending Label Smoothing Regularization with Self-Knowledge Distillation | - | 0
SELF-KNOWLEDGE DISTILLATION ADVERSARIAL ATTACK | - | 0
Revisiting Knowledge Distillation via Label Smoothing Regularization | Code | 0
Self-Knowledge Distillation in Natural Language Processing | - | 0
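The papers listed above share one core idea: a model serves as its own teacher, distilling its own softened predictions (from an earlier snapshot, a dropout pass, or an auxiliary branch) back into its training signal. As a minimal sketch of the generic loss, here is a numpy version that combines hard-label cross-entropy with a temperature-scaled KL term against the model's own previous predictions; the function names, the snapshot-teacher setup, and the `alpha`/`t` hyperparameters are illustrative assumptions, not any single paper's method.

```python
import numpy as np

def softmax(z, t=1.0):
    # Temperature-scaled softmax; larger t yields softer distributions.
    z = z / t
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_kd_loss(logits, prev_logits, labels, t=2.0, alpha=0.5):
    """Generic self-knowledge-distillation objective (illustrative):
    (1 - alpha) * CE(labels, logits)
      + alpha * t^2 * KL(softmax(prev_logits / t) || softmax(logits / t))
    where prev_logits come from the model's own earlier snapshot,
    acting as the "teacher"."""
    n = logits.shape[0]
    p = softmax(logits)
    # Hard-label cross-entropy on the current predictions.
    ce = -np.log(p[np.arange(n), labels] + 1e-12).mean()
    # Soft targets from the model's own previous predictions.
    q = softmax(prev_logits, t)      # self-teacher distribution
    p_t = softmax(logits, t)         # student distribution at the same temperature
    kl = (q * (np.log(q + 1e-12) - np.log(p_t + 1e-12))).sum(axis=-1).mean()
    # t^2 rescaling keeps gradient magnitudes comparable across temperatures.
    return (1 - alpha) * ce + alpha * (t ** 2) * kl
```

When `prev_logits` equals the current logits the KL term vanishes, so the loss reduces to plain cross-entropy scaled by `1 - alpha`; variants in the list above differ mainly in where the teacher signal comes from (dropout passes, bidirectional decoders, quantized ensembles, etc.).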
Page 3 of 3

No leaderboard results yet.