| Title | Date | Tags | Code | Stars |
|---|---|---|---|---|
| Investigating and Enhancing Vision-Audio Capability in Omnimodal Large Language Models | Feb 27, 2025 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding | Jun 7, 2022 | Graph Embedding, Knowledge Distillation | Unverified | 0 |
| Confidence Attention and Generalization Enhanced Distillation for Continuous Video Domain Adaptation | Mar 18, 2023 | Autonomous Driving, Domain Adaptation | Unverified | 0 |
| SeCoKD: Aligning Large Language Models for In-Context Learning with Fewer Shots | Jun 20, 2024 | In-Context Learning, Knowledge Distillation | Unverified | 0 |
| Generative Dataset Distillation Based on Self-knowledge Distillation | Jan 8, 2025 | Dataset Distillation, Knowledge Distillation | Unverified | 0 |
| Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition | Feb 4, 2022 | Classification, Knowledge Distillation | Unverified | 0 |
| Double Reverse Regularization Network Based on Self-Knowledge Distillation for SAR Object Classification | Nov 26, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| A Novel Self-Knowledge Distillation Approach with Siamese Representation Learning for Action Recognition | Sep 3, 2022 | Action Recognition, Knowledge Distillation | Unverified | 0 |
| AI-KD: Adversarial learning and Implicit regularization for self-Knowledge Distillation | Nov 20, 2022 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
| Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack | May 3, 2021 | Knowledge Distillation, Self-Knowledge Distillation | Unverified | 0 |
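The common thread across these papers is self-knowledge distillation: a model regularized by its own softened predictions rather than a separate, larger teacher. As a point of reference for the listed methods, below is a minimal PyTorch sketch of the generic objective, here using an EMA snapshot of the model as its own teacher. The EMA variant, the helper names (`self_kd_loss`, `ema_update`), and the hyperparameters `T` and `alpha` are illustrative assumptions, not the formulation of any specific paper above.

```python
import torch
import torch.nn.functional as F

def self_kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hard-label cross-entropy plus KL divergence to the model's own
    softened predictions (the generic self-KD objective; hyperparameters
    are illustrative)."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 scaling keeps gradient magnitudes comparable
    return (1 - alpha) * ce + alpha * kd

@torch.no_grad()
def ema_update(teacher, student, decay=0.999):
    """Exponential moving average of the student's weights; the EMA copy
    plays the role of the teacher in this sketch."""
    for tp, sp in zip(teacher.parameters(), student.parameters()):
        tp.mul_(decay).add_(sp, alpha=1 - decay)
```

In a training loop, one would forward the batch through both the student and the EMA copy, apply `self_kd_loss`, backpropagate through the student only, and call `ema_update` after each optimizer step. Papers in the table vary this template, e.g. by replacing the EMA teacher with earlier-epoch snapshots, auxiliary branches, or confidence-weighted targets.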