SOTAVerified

Self-Knowledge Distillation

Papers

Showing 21–30 of 68 papers

Title (Hype)

- Investigating and Enhancing Vision-Audio Capability in Omnimodal Large Language Models (Hype: 0)
- Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding (Hype: 0)
- Confidence Attention and Generalization Enhanced Distillation for Continuous Video Domain Adaptation (Hype: 0)
- SeCoKD: Aligning Large Language Models for In-Context Learning with Fewer Shots (Hype: 0)
- Generative Dataset Distillation Based on Self-knowledge Distillation (Hype: 0)
- Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition (Hype: 0)
- Double Reverse Regularization Network Based on Self-Knowledge Distillation for SAR Object Classification (Hype: 0)
- A Novel Self-Knowledge Distillation Approach with Siamese Representation Learning for Action Recognition (Hype: 0)
- AI-KD: Adversarial learning and Implicit regularization for self-Knowledge Distillation (Hype: 0)
- Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack (Hype: 0)
Page 3 of 7

No leaderboard results yet.