SOTAVerified

Self-Knowledge Distillation
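
For context: in self-knowledge distillation the network serves as its own teacher, regularizing training with soft targets taken from its own (e.g. earlier or auxiliary) predictions instead of a separate teacher model. A minimal sketch of the usual combined loss — hard-label cross-entropy plus a temperature-softened KL term against the model's own previous logits — is below; the function and parameter names are illustrative, not from any specific paper listed here.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over a 1-D logit vector."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def self_distillation_loss(logits, prev_logits, labels_onehot,
                           T=2.0, alpha=0.5):
    """Hypothetical combined loss: (1 - alpha) * cross-entropy on hard
    labels + alpha * T^2 * KL(soft targets || current prediction),
    where the soft targets come from the model's OWN earlier logits."""
    p = softmax(logits)
    ce = -np.sum(np.asarray(labels_onehot, dtype=float)
                 * np.log(p + 1e-12))
    q_t = softmax(prev_logits, T)   # self-teacher distribution
    p_t = softmax(logits, T)        # current softened prediction
    kl = np.sum(q_t * (np.log(q_t + 1e-12) - np.log(p_t + 1e-12)))
    return (1 - alpha) * ce + alpha * (T ** 2) * kl
```

When the current and previous logits agree, the KL term vanishes and only the supervised cross-entropy remains; the `T ** 2` factor is the standard gradient-scale correction used in distillation losses.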

Papers

Showing 31–40 of 68 papers

Title | Status | Hype
Vision-Language Meets the Skeleton: Progressively Distillation with Cross-Modal Knowledge for 3D Action Representation Learning | Code | 0
Weakly Supervised Monocular 3D Detection with a Single-View Image | | 0
Distilled Gradual Pruning with Pruned Fine-tuning | Code | 0
Deep Clustering with Diffused Sampling and Hardness-aware Self-distillation | Code | 0
X Modality Assisting RGBT Object Tracking | | 0
Combining inherent knowledge of vision-language models with unsupervised domain adaptation through strong-weak guidance | Code | 0
Double Reverse Regularization Network Based on Self-Knowledge Distillation for SAR Object Classification | | 0
Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation | Code | 0
Eyelid’s Intrinsic Motion-aware Feature Learning for Real-time Eyeblink Detection in the Wild | Code | 0
Three Factors to Improve Out-of-Distribution Detection | | 0
Page 4 of 7

No leaderboard results yet.