
Knowledge Distillation

Knowledge distillation is the process of transferring knowledge from a large model (the teacher) to a smaller one (the student). While large models (such as very deep neural networks or ensembles of many models) have higher knowledge capacity than small models, this capacity is often not fully utilized, so a well-trained student can recover much of the teacher's accuracy at a fraction of the inference cost.
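
In the classic recipe (Hinton et al., 2015), the student is trained to match the teacher's temperature-softened output distribution while still fitting the ground-truth labels. The PyTorch sketch below is a minimal, illustrative rendering of that loss, not the exact objective of any paper listed here; the temperature `T` and mixing weight `alpha` are assumed hyperparameters.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend a soft loss against the teacher with the usual hard-label loss.

    Illustrative sketch; T and alpha are assumed hyperparameters.
    """
    # Soft targets: KL divergence between temperature-softened distributions.
    # Scaling by T**2 keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In a training loop, `teacher_logits` comes from a frozen forward pass of the teacher (e.g., under `torch.no_grad()`), so only the student's parameters receive gradients.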

Papers

Showing 1951–1975 of 4240 papers

Title | Status | Hype
ComKD-CLIP: Comprehensive Knowledge Distillation for Contrastive Language-Image Pre-traning Model | — | 0
Efficient Object Detection in Optical Remote Sensing Imagery via Attention-based Feature Distillation | — | 0
Improving Mathematical Reasoning Capabilities of Small Language Models via Feedback-Driven Distillation | — | 0
Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Understanding | — | 0
CoMBO: Conflict Mitigation via Branched Optimization for Class Incremental Segmentation | — | 0
A Survey on Model Compression for Large Language Models | — | 0
Improving Neural Machine Translation by Denoising Training | — | 0
Improving Neural ODEs via Knowledge Distillation | — | 0
KAT-V1: Kwai-AutoThink Technical Report | — | 0
Diffusion-Augmented Coreset Expansion for Scalable Dataset Distillation | — | 0
KD^2M: An unifying framework for feature knowledge distillation | — | 0
Improving Pronunciation and Accent Conversion through Knowledge Distillation And Synthetic Ground-Truth from Native TTS | — | 0
Efficient Machine Translation with Model Pruning and Quantization | — | 0
Towards Complementary Knowledge Distillation for Efficient Dense Image Prediction | — | 0
Combining Curriculum Learning and Knowledge Distillation for Dialogue Generation | — | 0
Improving Route Choice Models by Incorporating Contextual Factors via Knowledge Distillation | — | 0
Combining Compressions for Multiplicative Size Scaling on Natural Language Tasks | — | 0
ABC-KD: Attention-Based-Compression Knowledge Distillation for Deep Learning-Based Noise Suppression | — | 0
KDC-MAE: Knowledge Distilled Contrastive Mask Auto-Encoder | — | 0
Efficient Knowledge Distillation via Curriculum Extraction | — | 0
Efficient Knowledge Distillation of SAM for Medical Image Segmentation | — | 0
Improving Task-Agnostic BERT Distillation with Layer Mapping Search | — | 0
Collective Wisdom: Improving Low-resource Neural Machine Translation using Adaptive Knowledge Distillation | — | 0
Improving the Interpretability of Deep Neural Networks with Knowledge Distillation | — | 0
Efficient Knowledge Distillation: Empowering Small Language Models with Teacher Model Insights | — | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ScaleKD (T: BEiT-L, S: ViT-B/14) | Top-1 accuracy (%) | 86.43 | — | Unverified
2 | ScaleKD (T: Swin-L, S: ViT-B/16) | Top-1 accuracy (%) | 85.53 | — | Unverified
3 | ScaleKD (T: Swin-L, S: ViT-S/16) | Top-1 accuracy (%) | 83.93 | — | Unverified
4 | ScaleKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 83.8 | — | Unverified
5 | KD++ (T: regnety-16GF, S: ViT-B) | Top-1 accuracy (%) | 83.6 | — | Unverified
6 | VkD (T: RegNety 160, S: DeiT-S) | Top-1 accuracy (%) | 82.9 | — | Unverified
7 | SpectralKD (T: Swin-S, S: Swin-T) | Top-1 accuracy (%) | 82.7 | — | Unverified
8 | ScaleKD (T: Swin-L, S: ResNet-50) | Top-1 accuracy (%) | 82.55 | — | Unverified
9 | DiffKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.5 | — | Unverified
10 | DIST (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.3 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | SRD (T: resnet-32x4, S: shufflenet-v2) | Top-1 Accuracy (%) | 79.86 | — | Unverified
2 | shufflenet-v2 (T: resnet-32x4, S: shufflenet-v2) | Top-1 Accuracy (%) | 78.76 | — | Unverified
3 | MV-MR (T: CLIP/ViT-B-16, S: resnet50) | Top-1 Accuracy (%) | 78.6 | — | Unverified
4 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 78.28 | — | Unverified
5 | resnet8x4 (T: resnet32x4, S: resnet8x4 [modified]) | Top-1 Accuracy (%) | 78.08 | — | Unverified
6 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v2) | Top-1 Accuracy (%) | 77.93 | — | Unverified
7 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v1) | Top-1 Accuracy (%) | 77.68 | — | Unverified
8 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 77.5 | — | Unverified
9 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 76.68 | — | Unverified
10 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 76.31 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | LSHFM (T: ResNet101, S: ResNet50) | mAP | 93.17 | — | Unverified
2 | LSHFM (T: ResNet101, S: MobileNetV2) | mAP | 90.14 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | TIE-KD (T: Adabins, S: MobileNetV2) | RMSE | 2.43 | — | Unverified