SOTAVerified

Self-Knowledge Distillation

Papers

Showing 41-50 of 68 papers

| Title | Status | Hype |
| --- | --- | --- |
| Incorporating Graph Information in Transformer-based AMR Parsing | Code | 0 |
| Self-Knowledge Distillation for Surgical Phase Recognition | | 0 |
| Lightweight Self-Knowledge Distillation with Multi-source Information Fusion | Code | 0 |
| From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels | Code | 0 |
| Confidence Attention and Generalization Enhanced Distillation for Continuous Video Domain Adaptation | | 0 |
| You Do Not Need Additional Priors or Regularizers in Retinex-Based Low-Light Image Enhancement | | 0 |
| Siamese Sleep Transformer For Robust Sleep Stage Scoring With Self-knowledge Distillation and Selective Batch Sampling | | 0 |
| AI-KD: Adversarial learning and Implicit regularization for self-Knowledge Distillation | | 0 |
| TASKED: Transformer-based Adversarial learning for human activity recognition using wearable sensors via Self-KnowledgE Distillation | | 0 |
| A Novel Self-Knowledge Distillation Approach with Siamese Representation Learning for Action Recognition | | 0 |
Page 5 of 7

No leaderboard results yet.