SOTAVerified

Self-Knowledge Distillation
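For context, self-knowledge distillation trains a network against its own softened predictions instead of a separate teacher's. A minimal sketch of the generic recipe is below; the function names, the temperature/alpha values, and the "past predictions as teacher" blending scheme are illustrative assumptions, not the method of any specific paper listed on this page (exact formulations vary widely across them).

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a list of logits.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def self_kd_loss(logits, target_index, past_logits, temperature=4.0, alpha=0.5):
    """Blend hard-label cross-entropy with KL divergence toward the model's
    own softened earlier predictions (a generic self-KD sketch)."""
    probs = softmax(logits)
    ce = -math.log(probs[target_index])            # hard-label cross-entropy
    p_student = softmax(logits, temperature)       # current softened output
    p_teacher = softmax(past_logits, temperature)  # own earlier softened output
    kl = sum(t * math.log(t / s) for t, s in zip(p_teacher, p_student))
    # The T^2 factor keeps the soft-target term's gradient magnitude
    # comparable to the cross-entropy term, as in standard distillation.
    return (1 - alpha) * ce + alpha * (temperature ** 2) * kl
```

When the "past" predictions match the current ones exactly, the KL term vanishes and only the weighted cross-entropy remains; the self-distillation signal comes entirely from the mismatch between the two.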

Papers

Showing 1–50 of 68 papers

Title | Status | Hype
Tackling Data Heterogeneity in Federated Learning through Knowledge Distillation with Inequitable Aggregation | Code | 0
MoLe-VLA: Dynamic Layer-skipping Vision Language Action Model via Mixture-of-Layers for Efficient Robot Manipulation | — | 0
xVLM2Vec: Adapting LVLM-based embedding models to multilinguality using Self-Knowledge Distillation | — | 0
Investigating and Enhancing Vision-Audio Capability in Omnimodal Large Language Models | — | 0
Efficient Lung Ultrasound Severity Scoring Using Dedicated Feature Extractor | Code | 0
Generative Dataset Distillation Based on Self-knowledge Distillation | — | 0
Towards Satellite Non-IID Imagery: A Spectral Clustering-Assisted Federated Learning Approach | — | 0
Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning | Code | 0
SalNAS: Efficient Saliency-prediction Neural Architecture Search with self-knowledge distillation | Code | 0
Towards A Generalizable Pathology Foundation Model via Unified Knowledge Distillation | Code | 2
Three-Stream Temporal-Shift Attention Network Based on Self-Knowledge Distillation for Micro-Expression Recognition | Code | 1
SeCoKD: Aligning Large Language Models for In-Context Learning with Fewer Shots | — | 0
Self-Knowledge Distillation for Learning Ambiguity | — | 0
Guiding Frame-Level CTC Alignments Using Self-knowledge Distillation | Code | 0
Vision-Language Meets the Skeleton: Progressively Distillation with Cross-Modal Knowledge for 3D Action Representation Learning | Code | 0
CrossMatch: Enhance Semi-Supervised Medical Image Segmentation with Perturbation Strategies and Knowledge Distillation | Code | 1
Weakly Supervised Monocular 3D Detection with a Single-View Image | — | 0
Distilled Gradual Pruning with Pruned Fine-tuning | Code | 0
BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation | Code | 1
Deep Clustering with Diffused Sampling and Hardness-aware Self-distillation | Code | 0
X Modality Assisting RGBT Object Tracking | — | 0
Combining inherent knowledge of vision-language models with unsupervised domain adaptation through strong-weak guidance | Code | 0
Double Reverse Regularization Network Based on Self-Knowledge Distillation for SAR Object Classification | — | 0
Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation | Code | 0
FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning | Code | 1
Eyelid's Intrinsic Motion-aware Feature Learning for Real-time Eyeblink Detection in the Wild | Code | 0
Three Factors to Improve Out-of-Distribution Detection | — | 0
Effective Whole-body Pose Estimation with Two-stages Distillation | Code | 4
Robust Spatiotemporal Traffic Forecasting with Reinforced Dynamic Adversarial Training | Code | 1
Incorporating Graph Information in Transformer-based AMR Parsing | Code | 0
Self-Knowledge Distillation for Surgical Phase Recognition | — | 0
Lightweight Self-Knowledge Distillation with Multi-source Information Fusion | Code | 0
From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels | Code | 0
Confidence Attention and Generalization Enhanced Distillation for Continuous Video Domain Adaptation | — | 0
DualFair: Fair Representation Learning at Both Group and Individual Levels via Contrastive Self-supervision | Code | 1
Graph-based Knowledge Distillation: A survey and experimental evaluation | Code | 1
You Do Not Need Additional Priors or Regularizers in Retinex-Based Low-Light Image Enhancement | — | 0
Siamese Sleep Transformer For Robust Sleep Stage Scoring With Self-knowledge Distillation and Selective Batch Sampling | — | 0
AI-KD: Adversarial learning and Implicit regularization for self-Knowledge Distillation | — | 0
Multimodality Multi-Lead ECG Arrhythmia Classification using Self-Supervised Learning | Code | 1
TASKED: Transformer-based Adversarial learning for human activity recognition using wearable sensors via Self-KnowledgE Distillation | — | 0
A Novel Self-Knowledge Distillation Approach with Siamese Representation Learning for Action Recognition | — | 0
RAIN: RegulArization on Input and Network for Black-Box Domain Adaptation | Code | 0
Self-Knowledge Distillation via Dropout | — | 0
MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition | Code | 1
Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding | — | 0
Self-Knowledge Distillation based Self-Supervised Learning for Covid-19 Detection from Chest X-Ray Images | — | 0
Spatial Likelihood Voting with Self-Knowledge Distillation for Weakly Supervised Object Detection | — | 0
Look Backward and Forward: Self-Knowledge Distillation with Bidirectional Decoder for Neural Machine Translation | — | 0
Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition | — | 0
Page 1 of 2

No leaderboard results yet.