SOTAVerified

Self-Knowledge Distillation

Papers

Showing 51–68 of 68 papers

Title | Status | Hype
Lightweight Self-Knowledge Distillation with Multi-source Information Fusion | Code | 0
Incorporating Graph Information in Transformer-based AMR Parsing | Code | 0
Guiding Frame-Level CTC Alignments Using Self-knowledge Distillation | Code | 0
Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation | Code | 0
From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels | Code | 0
Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning | Code | 0
Eyelid’s Intrinsic Motion-aware Feature Learning for Real-time Eyeblink Detection in the Wild | Code | 0
Revisiting Knowledge Distillation via Label Smoothing Regularization | Code | 0
Robust and Accurate Object Detection via Self-Knowledge Distillation | Code | 0
Efficient Lung Ultrasound Severity Scoring Using Dedicated Feature Extractor | Code | 0
SalNAS: Efficient Saliency-prediction Neural Architecture Search with Self-knowledge Distillation | Code | 0
Improving Sequential Recommendations via Bidirectional Temporal Data Augmentation with Pre-training | Code | 0
Combining Inherent Knowledge of Vision-Language Models with Unsupervised Domain Adaptation through Strong-Weak Guidance | Code | 0
Distilled Gradual Pruning with Pruned Fine-tuning | Code | 0
Deep Clustering with Diffused Sampling and Hardness-aware Self-distillation | Code | 0
Vision-Language Meets the Skeleton: Progressively Distillation with Cross-Modal Knowledge for 3D Action Representation Learning | Code | 0
RAIN: RegulArization on Input and Network for Black-Box Domain Adaptation | Code | 0
Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation | Code | 0
Page 2 of 2

No leaderboard results yet.