SOTAVerified

Knowledge Distillation

Knowledge distillation is the process of transferring knowledge from a large model to a smaller one. While large models (such as very deep neural networks or ensembles of many models) have higher knowledge capacity than small models, this capacity might not be fully utilized.
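The most common formulation, following Hinton et al.'s soft-target approach, trains the small "student" to match the temperature-softened output distribution of the large "teacher" while also fitting the ground-truth labels. Below is a minimal NumPy sketch of that combined loss; the function names, the temperature of 4.0, and the mixing weight `alpha` are illustrative choices, not values prescribed by any particular paper on this page.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher T yields a softer distribution."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Soft-target distillation loss: a weighted sum of
    (1) KL divergence between teacher and student soft distributions, and
    (2) cross-entropy between the student and the hard labels."""
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    # KL(teacher || student), scaled by T^2 so gradient magnitudes stay
    # comparable across temperatures (the usual convention)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    soft_loss = (temperature ** 2) * kl.mean()
    # standard cross-entropy on the ground-truth labels (T = 1)
    p_hard = softmax(student_logits)
    hard_loss = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

In practice the same loss is usually written against framework tensors (e.g. `KLDivLoss` plus `CrossEntropyLoss` in PyTorch) so that gradients flow only into the student; the teacher's logits are treated as constants.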

Papers

Showing 3926-3950 of 4240 papers

| Title | Status | Hype |
|---|---|---|
| Multi-to-Single Knowledge Distillation for Point Cloud Semantic Segmentation | Code | 0 |
| Right Time to Learn: Promoting Generalization via Bio-inspired Spacing Effect in Knowledge Distillation | Code | 0 |
| Chemical transformer compression for accelerating both training and inference of molecular modeling | Code | 0 |
| Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning | Code | 0 |
| An Efficient End-to-End Approach to Noise Invariant Speech Features via Multi-Task Learning | Code | 0 |
| Frameless Graph Knowledge Distillation | Code | 0 |
| Discourse Structures Guided Fine-grained Propaganda Identification | Code | 0 |
| MiniDisc: Minimal Distillation Schedule for Language Model Compression | Code | 0 |
| FractalAD: A simple industrial anomaly detection method using fractal anomaly generation and backbone knowledge distillation | Code | 0 |
| Student Becomes Decathlon Master in Retinal Vessel Segmentation via Dual-teacher Multi-target Domain Adaptation | Code | 0 |
| Robust and Accurate Object Detection via Self-Knowledge Distillation | Code | 0 |
| Multi-View 3D Reconstruction using Knowledge Distillation | Code | 0 |
| Exploring Generalizable Distillation for Efficient Medical Image Segmentation | Code | 0 |
| DiSCo: LLM Knowledge Distillation for Efficient Sparse Retrieval in Conversational Search | Code | 0 |
| Digital Staining with Knowledge Distillation: A Unified Framework for Unpaired and Paired-But-Misaligned Data | Code | 0 |
| Analyzing the Confidentiality of Undistillable Teachers in Knowledge Distillation | Code | 0 |
| Mutual-Learning Knowledge Distillation for Nighttime UAV Tracking | Code | 0 |
| A Unified Object Counting Network with Object Occupation Prior | Code | 0 |
| Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation | Code | 0 |
| Towards Data-Free Domain Generalization | Code | 0 |
| MV-MR: multi-views and multi-representations for self-supervised learning and knowledge distillation | Code | 0 |
| Robust Knowledge Distillation Based on Feature Variance Against Backdoored Teacher Model | Code | 0 |
| Accelerated Proton Resonance Frequency-based Magnetic Resonance Thermometry by Optimized Deep Learning Method | Code | 0 |
| Foundation Models for Structural Health Monitoring | Code | 0 |
| Natural Language Generation for Effective Knowledge Distillation | Code | 0 |
Page 158 of 170

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | ScaleKD (T: BEiT-L, S: ViT-B/14) | Top-1 accuracy (%) | 86.43 | | Unverified |
| 2 | ScaleKD (T: Swin-L, S: ViT-B/16) | Top-1 accuracy (%) | 85.53 | | Unverified |
| 3 | ScaleKD (T: Swin-L, S: ViT-S/16) | Top-1 accuracy (%) | 83.93 | | Unverified |
| 4 | ScaleKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 83.8 | | Unverified |
| 5 | KD++ (T: RegNetY-16GF, S: ViT-B) | Top-1 accuracy (%) | 83.6 | | Unverified |
| 6 | VkD (T: RegNetY-160, S: DeiT-S) | Top-1 accuracy (%) | 82.9 | | Unverified |
| 7 | SpectralKD (T: Swin-S, S: Swin-T) | Top-1 accuracy (%) | 82.7 | | Unverified |
| 8 | ScaleKD (T: Swin-L, S: ResNet-50) | Top-1 accuracy (%) | 82.55 | | Unverified |
| 9 | DiffKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.5 | | Unverified |
| 10 | DIST (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.3 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SRD (T: resnet-32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 79.86 | | Unverified |
| 2 | shufflenet-v2 (T: resnet-32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 78.76 | | Unverified |
| 3 | MV-MR (T: CLIP/ViT-B-16, S: resnet50) | Top-1 accuracy (%) | 78.6 | | Unverified |
| 4 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 78.28 | | Unverified |
| 5 | resnet8x4 (T: resnet32x4, S: resnet8x4 [modified]) | Top-1 accuracy (%) | 78.08 | | Unverified |
| 6 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 77.93 | | Unverified |
| 7 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v1) | Top-1 accuracy (%) | 77.68 | | Unverified |
| 8 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 77.5 | | Unverified |
| 9 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 76.68 | | Unverified |
| 10 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 76.31 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | LSHFM (T: ResNet101, S: ResNet50) | mAP | 93.17 | | Unverified |
| 2 | LSHFM (T: ResNet101, S: MobileNetV2) | mAP | 90.14 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | TIE-KD (T: Adabins, S: MobileNetV2) | RMSE | 2.43 | | Unverified |