| Title | Date | Topics | Code | # |
| --- | --- | --- | --- | --- |
| Improving Sequential Recommendations via Bidirectional Temporal Data Augmentation with Pre-training | Dec 13, 2021 | Data Augmentation, Knowledge Distillation | Code Available | 0 |
| Extracting knowledge from features with multilevel abstraction | Dec 4, 2021 | Data Augmentation, Knowledge Distillation | Unverified | 0 |
| Robust and Accurate Object Detection via Self-Knowledge Distillation | Nov 14, 2021 | Adversarial Robustness, Knowledge Distillation | Code Available | 0 |
| Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation | Oct 1, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 0 |
| Preservation of the Global Knowledge by Not-True Distillation in Federated Learning | Jun 6, 2021 | Continual Learning, Federated Learning | Code Available | 1 |
| Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack | May 3, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation | Mar 17, 2021 | Automatic Speech Recognition (ASR) | Unverified | 0 |
| Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation | Mar 15, 2021 | Data Augmentation, Knowledge Distillation | Code Available | 1 |
| Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation | Feb 25, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Code Available | 1 |
| Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks | Sep 30, 2020 | Image Classification | Unverified | 0 |