SOTAVerified

Knowledge Distillation

Knowledge distillation is the process of transferring knowledge from a large model to a smaller one. While large models (such as very deep neural networks or ensembles of many models) have higher knowledge capacity than small models, this capacity might not be fully utilized. In the standard setup the large model is called the teacher and the small one the student: the student is trained to match the teacher's softened output distribution in addition to (or instead of) the hard ground-truth labels.
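As a minimal sketch of the core idea, the following pure-Python snippet implements the temperature-scaled distillation loss from Hinton et al.'s softmax-temperature formulation. The example logits and the temperature of 4.0 are illustrative assumptions, and a full training loss would typically combine this term with an ordinary cross-entropy on the hard labels:

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature scaling: higher T gives softer distributions."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL divergence between softened teacher and student distributions.

    The loss is scaled by T^2 so its gradient magnitude stays comparable
    as the temperature changes, as in Hinton et al.'s formulation.
    """
    p = softmax(teacher_logits, temperature)  # soft targets from the teacher
    q = softmax(student_logits, temperature)  # student's softened predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student whose logits track the teacher's incurs a small loss;
# a student that disagrees ranks the classes differently and pays more.
teacher = [6.0, 2.0, -1.0]
close_student = [5.5, 2.2, -0.8]
far_student = [-1.0, 6.0, 2.0]
print(distillation_loss(teacher, close_student)
      < distillation_loss(teacher, far_student))  # → True
```

Because the teacher's soft targets carry relative probabilities across all classes (the "dark knowledge"), the student receives a richer training signal than one-hot labels alone provide.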

Papers

Showing 3901–3925 of 4240 papers

Title | Status | Hype
G^2D: Boosting Multimodal Learning with Gradient-Guided Distillation | Code | 0
Revisiting Knowledge Distillation for Autoregressive Language Models | Code | 0
Adaptive Distillation: Aggregating Knowledge from Multiple Paths for Efficient Distillation | Code | 0
Revisiting Knowledge Distillation under Distribution Shift | Code | 0
Distillation-based fabric anomaly detection | Code | 0
Multiple Teachers-Meticulous Student: A Domain Adaptive Meta-Knowledge Distillation Model for Medical Image Classification | Code | 0
F-VLM: Open-Vocabulary Object Detection upon Frozen Vision and Language Models | Code | 0
Knowledge-guided Causal Intervention for Weakly-supervised Object Localization | Code | 0
Structured Knowledge Distillation for Dense Prediction | Code | 0
Distill2Vec: Dynamic Graph Representation Learning with Knowledge Distillation | Code | 0
Multi-source-free Domain Adaptation via Uncertainty-aware Adaptive Distillation | Code | 0
Multi-Stage Balanced Distillation: Addressing Long-Tail Challenges in Sequence-Level Knowledge Distillation | Code | 0
Multistage Collaborative Knowledge Distillation from a Large Language Model for Semi-Supervised Sequence Generation | Code | 0
Structured Knowledge Distillation for Semantic Segmentation | Code | 0
Multi-stage Distillation Framework for Cross-Lingual Semantic Similarity Matching | Code | 0
Revisiting Knowledge Distillation via Label Smoothing Regularization | Code | 0
Towards Class-wise Fair Adversarial Training via Anti-Bias Soft Label Distillation | Code | 0
WaterMono: Teacher-Guided Anomaly Masking and Enhancement Boosting for Robust Underwater Self-Supervised Monocular Depth Estimation | Code | 0
Disentangling spatio-temporal knowledge for weakly supervised object detection and segmentation in surgical video | Code | 0
FS-BAN: Born-Again Networks for Domain Generalization Few-Shot Classification | Code | 0
From underwater to aerial: a novel multi-scale knowledge distillation approach for coral reef monitoring | Code | 0
Preference-Consistent Knowledge Distillation for Recommender System | Code | 0
CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective | Code | 0
Multi-Teacher Knowledge Distillation For Text Image Machine Translation | Code | 0
Multi Teacher Privileged Knowledge Distillation for Multimodal Expression Recognition | Code | 0
Page 157 of 170

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ScaleKD (T: BEiT-L, S: ViT-B/14) | Top-1 accuracy (%) | 86.43 | - | Unverified
2 | ScaleKD (T: Swin-L, S: ViT-B/16) | Top-1 accuracy (%) | 85.53 | - | Unverified
3 | ScaleKD (T: Swin-L, S: ViT-S/16) | Top-1 accuracy (%) | 83.93 | - | Unverified
4 | ScaleKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 83.8 | - | Unverified
5 | KD++ (T: regnety-16GF, S: ViT-B) | Top-1 accuracy (%) | 83.6 | - | Unverified
6 | VkD (T: RegNety 160, S: DeiT-S) | Top-1 accuracy (%) | 82.9 | - | Unverified
7 | SpectralKD (T: Swin-S, S: Swin-T) | Top-1 accuracy (%) | 82.7 | - | Unverified
8 | ScaleKD (T: Swin-L, S: ResNet-50) | Top-1 accuracy (%) | 82.55 | - | Unverified
9 | DiffKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.5 | - | Unverified
10 | DIST (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.3 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | SRD (T: resnet-32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 79.86 | - | Unverified
2 | shufflenet-v2 (T: resnet-32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 78.76 | - | Unverified
3 | MV-MR (T: CLIP/ViT-B-16, S: resnet50) | Top-1 accuracy (%) | 78.6 | - | Unverified
4 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 78.28 | - | Unverified
5 | resnet8x4 (T: resnet32x4, S: resnet8x4 [modified]) | Top-1 accuracy (%) | 78.08 | - | Unverified
6 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 77.93 | - | Unverified
7 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v1) | Top-1 accuracy (%) | 77.68 | - | Unverified
8 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 77.5 | - | Unverified
9 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 76.68 | - | Unverified
10 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 76.31 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | LSHFM (T: ResNet101, S: ResNet50) | mAP | 93.17 | - | Unverified
2 | LSHFM (T: ResNet101, S: MobileNetV2) | mAP | 90.14 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | TIE-KD (T: Adabins, S: MobileNetV2) | RMSE | 2.43 | - | Unverified