Knowledge Distillation

Knowledge distillation is the process of transferring knowledge from a large model to a smaller one. While large models (such as very deep neural networks or ensembles of many models) have a higher knowledge capacity than small models, that capacity may not be fully utilized. Distillation exploits this by training a compact "student" model to reproduce the outputs of a large "teacher" model, retaining much of the teacher's accuracy at a fraction of the inference cost.
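For reference, the classic formulation (Hinton et al., 2015) trains the student on a weighted sum of a temperature-softened KL divergence against the teacher's outputs and the ordinary cross-entropy on ground-truth labels. The sketch below assumes PyTorch; the function name and the default temperature and weighting are illustrative choices, not values taken from this page.

```python
# Minimal sketch of the classic soft-target distillation loss
# (Hinton et al., 2015). Assumes PyTorch; hyperparameters are illustrative.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soften both output distributions with temperature T. The KL term is
    # scaled by T^2 so its gradient magnitude stays comparable as T varies.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Standard supervised term on the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    # alpha balances imitating the teacher against fitting the labels.
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

In practice the teacher runs in eval mode with gradients disabled, so only the student's parameters are updated.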

Papers

Showing 1926–1950 of 4240 papers

| Title | Status | Hype |
|---|---|---|
| D^2TV: Dual Knowledge Distillation and Target-oriented Vision Modeling for Many-to-Many Multimodal Summarization | Code | 0 |
| Revisiting Data Augmentation in Model Compression: An Empirical and Comprehensive Study | | 0 |
| Understanding the Effect of Data Augmentation on Knowledge Distillation | | 0 |
| DualVC: Dual-mode Voice Conversion using Intra-model Knowledge Distillation and Hybrid Predictive Coding | | 0 |
| One-Shot Federated Learning for LEO Constellations that Reduces Convergence Time from Days to 90 Minutes | | 0 |
| DisCo: Distilled Student Models Co-training for Semi-supervised Text Mining | Code | 1 |
| Lifting the Curse of Capacity Gap in Distilling Language Models | Code | 1 |
| Accurate Knowledge Distillation with n-best Reranking | | 0 |
| Sentence Embedder Guided Utterance Encoder (SEGUE) for Spoken Language Understanding | Code | 0 |
| Pseudo-Label Training and Model Inertia in Neural Machine Translation | | 0 |
| Boost Vision Transformer with GPU-Friendly Sparsity and Quantization | | 0 |
| BERM: Training the Balanced and Extractable Representation for Matching to Improve Generalization Ability of Dense Retrieval | | 0 |
| Cross-modality Data Augmentation for End-to-End Sign Language Translation | Code | 1 |
| DQ-Whisper: Joint Distillation and Quantization for Efficient Multilingual Speech Recognition | | 0 |
| Student-friendly Knowledge Distillation | | 0 |
| Catch-Up Distillation: You Only Need to Train Once for Accelerating Sampling | Code | 0 |
| AD-KD: Attribution-Driven Knowledge Distillation for Language Model Compression | Code | 1 |
| When Gradient Descent Meets Derivative-Free Optimization: A Match Made in Black-Box Scenario | | 0 |
| Tailoring Instructions to Student's Learning Levels Boosts Knowledge Distillation | Code | 1 |
| Weight-Inherited Distillation for Task-Agnostic BERT Compression | Code | 0 |
| Lightweight Self-Knowledge Distillation with Multi-source Information Fusion | Code | 0 |
| Bridging the Domain Gap: Self-Supervised 3D Scene Understanding with Foundation Models | Code | 1 |
| Soft Prompt Decoding for Multilingual Dense Retrieval | | 0 |
| Distilling Knowledge for Short-to-Long Term Trajectory Prediction | | 0 |
| Improving Defensive Distillation using Teacher Assistant | | 0 |

Page 78 of 170

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | ScaleKD (T: BEiT-L, S: ViT-B/14) | Top-1 accuracy (%) | 86.43 | | Unverified |
| 2 | ScaleKD (T: Swin-L, S: ViT-B/16) | Top-1 accuracy (%) | 85.53 | | Unverified |
| 3 | ScaleKD (T: Swin-L, S: ViT-S/16) | Top-1 accuracy (%) | 83.93 | | Unverified |
| 4 | ScaleKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 83.8 | | Unverified |
| 5 | KD++ (T: regnety-16GF, S: ViT-B) | Top-1 accuracy (%) | 83.6 | | Unverified |
| 6 | VkD (T: RegNety 160, S: DeiT-S) | Top-1 accuracy (%) | 82.9 | | Unverified |
| 7 | SpectralKD (T: Swin-S, S: Swin-T) | Top-1 accuracy (%) | 82.7 | | Unverified |
| 8 | ScaleKD (T: Swin-L, S: ResNet-50) | Top-1 accuracy (%) | 82.55 | | Unverified |
| 9 | DiffKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.5 | | Unverified |
| 10 | DIST (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.3 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SRD (T: resnet-32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 79.86 | | Unverified |
| 2 | shufflenet-v2 (T: resnet-32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 78.76 | | Unverified |
| 3 | MV-MR (T: CLIP/ViT-B-16, S: resnet50) | Top-1 accuracy (%) | 78.6 | | Unverified |
| 4 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 78.28 | | Unverified |
| 5 | resnet8x4 (T: resnet32x4, S: resnet8x4 [modified]) | Top-1 accuracy (%) | 78.08 | | Unverified |
| 6 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 77.93 | | Unverified |
| 7 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v1) | Top-1 accuracy (%) | 77.68 | | Unverified |
| 8 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 77.5 | | Unverified |
| 9 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 76.68 | | Unverified |
| 10 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 76.31 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | LSHFM (T: ResNet101, S: ResNet50) | mAP | 93.17 | | Unverified |
| 2 | LSHFM (T: ResNet101, S: MobileNetV2) | mAP | 90.14 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | TIE-KD (T: Adabins, S: MobileNetV2) | RMSE | 2.43 | | Unverified |