| Title | Date | Tasks | Code | Stars |
| --- | --- | --- | --- | --- |
| Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation | Oct 1, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 |
| Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack | May 3, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation | Mar 17, 2021 | Automatic Speech Recognition (ASR) | Unverified | 0 |
| Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks | Sep 30, 2020 | Image Classification | Unverified | 0 |
| Extending Label Smoothing Regularization with Self-Knowledge Distillation | Sep 11, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Self-Knowledge Distillation Adversarial Attack | Sep 25, 2019 | Adversarial Attack, Knowledge Distillation | Unverified | 0 |
| Revisiting Knowledge Distillation via Label Smoothing Regularization | Sep 25, 2019 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 |
| Self-Knowledge Distillation in Natural Language Processing | Aug 2, 2019 | Deep Learning, Knowledge Distillation | Unverified | 0 |