SOTAVerified

Knowledge Distillation

Knowledge distillation is the process of transferring knowledge from a large model to a smaller one. While large models (such as very deep neural networks or ensembles of many models) have higher knowledge capacity than small models, this capacity might not be fully utilized. A compact "student" model can therefore often recover much of a large "teacher" model's accuracy by being trained to mimic the teacher's outputs rather than only the ground-truth labels.
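
The most common formulation, introduced by Hinton et al. (2015), trains the student on a temperature-softened version of the teacher's output distribution, blended with the usual hard-label loss. Below is a minimal sketch in PyTorch; the function name `distillation_loss` and the hyperparameter defaults (`temperature`, `alpha`) are illustrative choices, not taken from any paper listed on this page.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    """Classic soft-target distillation loss (Hinton et al., 2015).

    Blends a KL term matching the student's softened distribution to
    the teacher's with the ordinary cross-entropy on hard labels.
    """
    # Soften both distributions with the temperature, then match them.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2  # rescale so gradients are temperature-invariant
    # Standard supervised loss on the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Hypothetical usage: `teacher` and `student` are any classifier pair
# producing logits of shape (batch, num_classes).
# with torch.no_grad():              # teacher is frozen during training
#     teacher_logits = teacher(images)
# loss = distillation_loss(student(images), teacher_logits, labels)
```

In a training loop the teacher runs in eval mode with gradients disabled; only the student's parameters are updated against this loss.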

Papers

Showing 2626–2650 of 4240 papers (page 106 of 170)

Title | Status | Hype
ADU-Depth: Attention-based Distillation with Uncertainty Modeling for Depth Estimation | - | 0
Advancing Deep Learning through Probability Engineering: A Pragmatic Paradigm for Modern AI | - | 0
Advancing Medical Radiograph Representation Learning: A Hybrid Pre-training Paradigm with Multilevel Semantic Granularity | - | 0
Adversarial-Based Knowledge Distillation for Multi-Model Ensemble and Noisy Data Refinement | - | 0
Adversarial Curriculum Graph-Free Knowledge Distillation for Graph Neural Networks | - | 0
Adversarial Feature Alignment: Avoid Catastrophic Forgetting in Incremental Task Lifelong Learning | - | 0
Adversarial Finetuning with Latent Representation Constraint to Mitigate Accuracy-Robustness Tradeoff | - | 0
Adversarially Robust and Explainable Model Compression with On-Device Personalization for Text Classification | - | 0
Adversarial Prompt Distillation for Vision-Language Models | - | 0
Adversarial Robustness of Distilled and Pruned Deep Learning-based Wireless Classifiers | - | 0
Adversarial Self-Supervised Data-Free Distillation for Text Classification | - | 0
Adversarial Sparse Teacher: Defense Against Distillation-Based Model Stealing Attacks Using Adversarial Examples | - | 0
Adverse Weather Optical Flow: Cumulative Homogeneous-Heterogeneous Adaptation | - | 0
A dynamic interactive learning framework for automated 3D medical image segmentation | - | 0
A Flexible Multi-Task Model for BERT Serving | - | 0
Discovery of novel antimicrobial peptides with notable antibacterial potency by a LLM-based foundation model | - | 0
A Framework for Double-Blind Federated Adaptation of Foundation Models | - | 0
AfroXLMR-Comet: Multilingual Knowledge Distillation with Attention Matching for Low-Resource languages | - | 0
After-Stroke Arm Paresis Detection using Kinematic Data | - | 0
A Generalized and Robust Method Towards Practical Gaze Estimation on Smart Phone | - | 0
Generalized Supervised Contrastive Learning | - | 0
A General Multiple Data Augmentation Based Framework for Training Deep Neural Networks | - | 0
A Generative Framework for Personalized Learning and Estimation: Theory, Algorithms, and Privacy | - | 0
AgentDistill: Training-Free Agent Distillation with Generalizable MCP Boxes | - | 0
Agglomerating Large Vision Encoders via Distillation for VFSS Segmentation | - | 0

Benchmark Results

In the model names below, "T:" denotes the teacher model and "S:" the student. No result has yet been verified, so the Verified column is empty for every entry.

# | Model | Metric | Claimed | Verified | Status
1 | ScaleKD (T: BEiT-L, S: ViT-B/14) | Top-1 accuracy (%) | 86.43 | - | Unverified
2 | ScaleKD (T: Swin-L, S: ViT-B/16) | Top-1 accuracy (%) | 85.53 | - | Unverified
3 | ScaleKD (T: Swin-L, S: ViT-S/16) | Top-1 accuracy (%) | 83.93 | - | Unverified
4 | ScaleKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 83.8 | - | Unverified
5 | KD++ (T: RegNetY-16GF, S: ViT-B) | Top-1 accuracy (%) | 83.6 | - | Unverified
6 | VkD (T: RegNetY-160, S: DeiT-S) | Top-1 accuracy (%) | 82.9 | - | Unverified
7 | SpectralKD (T: Swin-S, S: Swin-T) | Top-1 accuracy (%) | 82.7 | - | Unverified
8 | ScaleKD (T: Swin-L, S: ResNet-50) | Top-1 accuracy (%) | 82.55 | - | Unverified
9 | DiffKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.5 | - | Unverified
10 | DIST (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.3 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | SRD (T: resnet-32x4, S: shufflenet-v2) | Top-1 Accuracy (%) | 79.86 | - | Unverified
2 | shufflenet-v2 (T: resnet-32x4, S: shufflenet-v2) | Top-1 Accuracy (%) | 78.76 | - | Unverified
3 | MV-MR (T: CLIP/ViT-B-16, S: resnet50) | Top-1 Accuracy (%) | 78.6 | - | Unverified
4 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 78.28 | - | Unverified
5 | resnet8x4 (T: resnet32x4, S: resnet8x4 [modified]) | Top-1 Accuracy (%) | 78.08 | - | Unverified
6 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v2) | Top-1 Accuracy (%) | 77.93 | - | Unverified
7 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v1) | Top-1 Accuracy (%) | 77.68 | - | Unverified
8 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 77.5 | - | Unverified
9 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 76.68 | - | Unverified
10 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 76.31 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | LSHFM (T: ResNet101, S: ResNet50) | mAP | 93.17 | - | Unverified
2 | LSHFM (T: ResNet101, S: MobileNetV2) | mAP | 90.14 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | TIE-KD (T: AdaBins, S: MobileNetV2) | RMSE | 2.43 | - | Unverified