| Title | Date | Tags | Code | |
| --- | --- | --- | --- | --- |
| Improving Sequential Recommendations via Bidirectional Temporal Data Augmentation with Pre-training | Dec 13, 2021 | Data Augmentation, Knowledge Distillation | Code Available | 0 |
| Extracting knowledge from features with multilevel abstraction | Dec 4, 2021 | Data Augmentation, Knowledge Distillation | Unverified | 0 |
| Robust and Accurate Object Detection via Self-Knowledge Distillation | Nov 14, 2021 | Adversarial Robustness, Knowledge Distillation | Code Available | 0 |
| Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation | Oct 1, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 |
| Preservation of the Global Knowledge by Not-True Distillation in Federated Learning | Jun 6, 2021 | Continual Learning, Federated Learning | Code Available | 1 |
| Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack | May 3, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation | Mar 17, 2021 | Automatic Speech Recognition (ASR) | Unverified | 0 |
| Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation | Mar 15, 2021 | Data Augmentation, Knowledge Distillation | Code Available | 1 |
| Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation | Feb 25, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 |
| Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks | Sep 30, 2020 | Image Classification | Unverified | 0 |
| Noisy Self-Knowledge Distillation for Text Summarization | Sep 15, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 |
| Extending Label Smoothing Regularization with Self-Knowledge Distillation | Sep 11, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Self-Knowledge Distillation with Progressive Refinement of Targets | Jun 22, 2020 | Image Classification | Code Available | 1 |
| ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks | May 7, 2020 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 |
| Regularizing Class-wise Predictions via Self-knowledge Distillation | Mar 31, 2020 | Image Classification | Code Available | 1 |
| Self-Knowledge Distillation Adversarial Attack | Sep 25, 2019 | Adversarial Attack, Knowledge Distillation | Unverified | 0 |
| Revisiting Knowledge Distillation via Label Smoothing Regularization | Sep 25, 2019 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 |
| Self-Knowledge Distillation in Natural Language Processing | Aug 2, 2019 | Deep Learning, Knowledge Distillation | Unverified | 0 |