SOTAVerified

Self-Knowledge Distillation
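The papers indexed below share a common idea: instead of distilling from a separate, larger teacher, a network is regularized against its own softened predictions (for example, from an earlier training snapshot or an auxiliary branch). A minimal NumPy sketch of the generic loss, where the function names and the `alpha`/`T` defaults are illustrative assumptions rather than any specific paper's recipe:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled, numerically stable softmax."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_kd_loss(student_logits, teacher_logits, targets, alpha=0.5, T=4.0):
    """Hard-label cross-entropy blended with a KL term that pulls the
    model toward its own softened past predictions (the 'self-teacher')."""
    n = len(targets)
    # cross-entropy on the ground-truth labels
    p = softmax(student_logits)
    ce = -np.log(p[np.arange(n), targets] + 1e-12).mean()
    # KL(teacher_soft || student_soft) at temperature T, rescaled by T^2
    ps = softmax(student_logits, T)
    pt = softmax(teacher_logits, T)
    kd = (pt * (np.log(pt + 1e-12) - np.log(ps + 1e-12))).sum(axis=-1).mean() * T * T
    return (1 - alpha) * ce + alpha * kd
```

In practice `teacher_logits` would come from a frozen copy of the same network; when it matches the student exactly, the KL term vanishes and only the cross-entropy remains.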

Papers

Showing 1–25 of 68 papers

Title | Status | Hype
Tackling Data Heterogeneity in Federated Learning through Knowledge Distillation with Inequitable Aggregation | Code | 0
MoLe-VLA: Dynamic Layer-skipping Vision Language Action Model via Mixture-of-Layers for Efficient Robot Manipulation | - | 0
xVLM2Vec: Adapting LVLM-based embedding models to multilinguality using Self-Knowledge Distillation | - | 0
Investigating and Enhancing Vision-Audio Capability in Omnimodal Large Language Models | - | 0
Efficient Lung Ultrasound Severity Scoring Using Dedicated Feature Extractor | Code | 0
Generative Dataset Distillation Based on Self-knowledge Distillation | - | 0
Towards Satellite Non-IID Imagery: A Spectral Clustering-Assisted Federated Learning Approach | - | 0
Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning | Code | 0
SalNAS: Efficient Saliency-prediction Neural Architecture Search with self-knowledge distillation | Code | 0
Towards A Generalizable Pathology Foundation Model via Unified Knowledge Distillation | Code | 2
Three-Stream Temporal-Shift Attention Network Based on Self-Knowledge Distillation for Micro-Expression Recognition | Code | 1
SeCoKD: Aligning Large Language Models for In-Context Learning with Fewer Shots | - | 0
Self-Knowledge Distillation for Learning Ambiguity | - | 0
Guiding Frame-Level CTC Alignments Using Self-knowledge Distillation | Code | 0
Vision-Language Meets the Skeleton: Progressively Distillation with Cross-Modal Knowledge for 3D Action Representation Learning | Code | 0
CrossMatch: Enhance Semi-Supervised Medical Image Segmentation with Perturbation Strategies and Knowledge Distillation | Code | 1
Weakly Supervised Monocular 3D Detection with a Single-View Image | - | 0
Distilled Gradual Pruning with Pruned Fine-tuning | Code | 0
BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation | Code | 1
Deep Clustering with Diffused Sampling and Hardness-aware Self-distillation | Code | 0
X Modality Assisting RGBT Object Tracking | - | 0
Combining inherent knowledge of vision-language models with unsupervised domain adaptation through strong-weak guidance | Code | 0
Double Reverse Regularization Network Based on Self-Knowledge Distillation for SAR Object Classification | - | 0
Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation | Code | 0
FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning | Code | 1
Page 1 of 3

No leaderboard results yet.