SOTAVerified

Self-Knowledge Distillation

Papers

Showing 11–20 of 68 papers

| Title | Status | Hype |
| --- | --- | --- |
| Three-Stream Temporal-Shift Attention Network Based on Self-Knowledge Distillation for Micro-Expression Recognition | Code | 1 |
| SeCoKD: Aligning Large Language Models for In-Context Learning with Fewer Shots | | 0 |
| Self-Knowledge Distillation for Learning Ambiguity | | 0 |
| Guiding Frame-Level CTC Alignments Using Self-knowledge Distillation | Code | 0 |
| Vision-Language Meets the Skeleton: Progressively Distillation with Cross-Modal Knowledge for 3D Action Representation Learning | Code | 0 |
| CrossMatch: Enhance Semi-Supervised Medical Image Segmentation with Perturbation Strategies and Knowledge Distillation | Code | 1 |
| Weakly Supervised Monocular 3D Detection with a Single-View Image | | 0 |
| Distilled Gradual Pruning with Pruned Fine-tuning | Code | 0 |
| BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation | Code | 1 |
| Deep Clustering with Diffused Sampling and Hardness-aware Self-distillation | Code | 0 |
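The papers above share a common objective: in self-knowledge distillation, a model is regularized toward its own softened predictions (e.g. from an earlier checkpoint or auxiliary branch) rather than toward a separate teacher. A minimal sketch of that generic loss shape, in plain Python; the `alpha`/`T` blend and function names here are illustrative assumptions, not the formulation of any specific paper listed:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [l / T for l in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(v - m) for v in scaled]
    s = sum(exps)
    return [e / s for e in exps]

def self_kd_loss(logits, past_logits, label, T=4.0, alpha=0.5):
    """Generic self-KD objective: cross-entropy on the hard label, blended
    with KL divergence to the model's *own* earlier, softened predictions.
    T**2 rescaling keeps gradient magnitudes comparable across temperatures."""
    p = softmax(logits)                  # current predictions at T = 1
    p_T = softmax(logits, T)             # current predictions, softened
    q_T = softmax(past_logits, T)        # self-teacher soft targets
    ce = -math.log(p[label])
    kl = sum(q * (math.log(q) - math.log(pt)) for q, pt in zip(q_T, p_T))
    return (1 - alpha) * ce + alpha * (T ** 2) * kl
```

With `alpha=0` this reduces to plain cross-entropy; when the current and past logits agree, the KL term vanishes and only the (scaled) label loss remains.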
Page 2 of 7

No leaderboard results yet.