| Title | Date | Topics | Code |
| --- | --- | --- | --- |
| RAIN: RegulArization on Input and Network for Black-Box Domain Adaptation | Aug 22, 2022 | Data Augmentation, Domain Adaptation | Code Available |
| Self-Knowledge Distillation via Dropout | Aug 11, 2022 | Adversarial Robustness, Image Classification | — |
| Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding | Jun 7, 2022 | Graph Embedding, Knowledge Distillation | — |
| Self-Knowledge Distillation based Self-Supervised Learning for Covid-19 Detection from Chest X-Ray Images | Jun 7, 2022 | Knowledge Distillation, Self-Knowledge Distillation | — |
| Spatial Likelihood Voting with Self-Knowledge Distillation for Weakly Supervised Object Detection | Apr 14, 2022 | Knowledge Distillation, Multiple Instance Learning | — |
| Look Backward and Forward: Self-Knowledge Distillation with Bidirectional Decoder for Neural Machine Translation | Mar 10, 2022 | Decoder, Knowledge Distillation | — |
| Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition | Feb 4, 2022 | Classification, Knowledge Distillation | — |
| Improving Sequential Recommendations via Bidirectional Temporal Data Augmentation with Pre-training | Dec 13, 2021 | Data Augmentation, Knowledge Distillation | Code Available |
| Extracting knowledge from features with multilevel abstraction | Dec 4, 2021 | Data Augmentation, Knowledge Distillation | — |
| Robust and Accurate Object Detection via Self-Knowledge Distillation | Nov 14, 2021 | Adversarial Robustness, Knowledge Distillation | Code Available |
| Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation | Oct 1, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Code Available |
| Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack | May 3, 2021 | Knowledge Distillation, Self-Knowledge Distillation | — |
| Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation | Mar 17, 2021 | Automatic Speech Recognition (ASR) | — |
| Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks | Sep 30, 2020 | Image Classification | — |
| Extending Label Smoothing Regularization with Self-Knowledge Distillation | Sep 11, 2020 | Knowledge Distillation, Self-Knowledge Distillation | — |
| Self-Knowledge Distillation Adversarial Attack | Sep 25, 2019 | Adversarial Attack, Knowledge Distillation | — |
| Revisiting Knowledge Distillation via Label Smoothing Regularization | Sep 25, 2019 | Knowledge Distillation, Self-Knowledge Distillation | Code Available |
| Self-Knowledge Distillation in Natural Language Processing | Aug 2, 2019 | Deep Learning, Knowledge Distillation | — |