SOTAVerified

Self-Knowledge Distillation

Papers

Showing 51–68 of 68 papers

| Title | Status | Hype |
| --- | --- | --- |
| SeCoKD: Aligning Large Language Models for In-Context Learning with Fewer Shots | | 0 |
| SELF-KNOWLEDGE DISTILLATION ADVERSARIAL ATTACK | | 0 |
| Self-Knowledge Distillation based Self-Supervised Learning for Covid-19 Detection from Chest X-Ray Images | | 0 |
| Self-Knowledge Distillation for Learning Ambiguity | | 0 |
| Self-Knowledge Distillation for Surgical Phase Recognition | | 0 |
| Self-Knowledge Distillation in Natural Language Processing | | 0 |
| Self-Knowledge Distillation via Dropout | | 0 |
| Siamese Sleep Transformer For Robust Sleep Stage Scoring With Self-knowledge Distillation and Selective Batch Sampling | | 0 |
| Spatial Likelihood Voting with Self-Knowledge Distillation for Weakly Supervised Object Detection | | 0 |
| Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks | | 0 |
| TASKED: Transformer-based Adversarial learning for human activity recognition using wearable sensors via Self-KnowledgE Distillation | | 0 |
| Three Factors to Improve Out-of-Distribution Detection | | 0 |
| Towards Satellite Non-IID Imagery: A Spectral Clustering-Assisted Federated Learning Approach | | 0 |
| Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation | | 0 |
| Weakly Supervised Monocular 3D Detection with a Single-View Image | | 0 |
| X Modality Assisting RGBT Object Tracking | | 0 |
| xVLM2Vec: Adapting LVLM-based embedding models to multilinguality using Self-Knowledge Distillation | | 0 |
| AI-KD: Adversarial learning and Implicit regularization for self-Knowledge Distillation | | 0 |
Page 2 of 2

No leaderboard results yet.