Knowledge Distillation

Knowledge distillation is the process of transferring knowledge from a large model (the teacher) to a smaller one (the student). While large models, such as very deep neural networks or ensembles of many models, have a higher knowledge capacity than small models, that capacity is often not fully utilized; distillation exploits this by training a compact student to reproduce the teacher's behavior, frequently recovering most of its accuracy at a fraction of the inference cost.
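
In practice, the most common instantiation is the soft-target (logit-matching) formulation from Hinton et al.'s "Distilling the Knowledge in a Neural Network": the student is trained on a weighted mix of ordinary cross-entropy and a KL-divergence term that matches its temperature-softened output distribution to the teacher's. The PyTorch sketch below is illustrative only; the temperature T, the mixing weight alpha, and the toy MLPs are assumptions for the example, not taken from any paper listed on this page.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft-target distillation: alpha * KL term + (1 - alpha) * cross-entropy.

    T and alpha are illustrative hyperparameters, not values from any
    paper on this page.
    """
    # Soft targets: KL divergence between temperature-softened distributions.
    # Scaling by T^2 keeps the soft-target gradients comparable in magnitude
    # across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy example: a wide "teacher" MLP guiding a much smaller "student" MLP.
teacher = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 10)).eval()
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))

x = torch.randn(8, 32)               # a batch of 8 feature vectors
labels = torch.randint(0, 10, (8,))  # ground-truth class labels
with torch.no_grad():                # the teacher is frozen during distillation
    teacher_logits = teacher(x)

loss = distillation_loss(student(x), teacher_logits, labels)
loss.backward()                      # gradients flow only into the student
```

Raising T above 1 flattens both distributions, exposing the teacher's relative probabilities over the wrong classes (its "dark knowledge"), which carries much of the transferable signal.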

Papers

Showing 1676–1700 of 4240 papers

Title | Status | Hype
Towards Comparable Knowledge Distillation in Semantic Image Segmentation | – | 0
Leveraging ASR Pretrained Conformers for Speaker Verification through Transfer Learning and Knowledge Distillation | – | 0
Knowledge Distillation Layer that Lets the Student Decide | Code | 0
DMKD: Improving Feature-based Knowledge Distillation for Object Detection Via Dual Masking Augmentation | – | 0
Rethinking Momentum Knowledge Distillation in Online Continual Learning | Code | 1
A deep Natural Language Inference predictor without language-specific training data | – | 0
Fast and High-Performance Learned Image Compression With Improved Checkerboard Context Model, Deformable Residual Module, and Knowledge Distillation | – | 0
TODM: Train Once Deploy Many Efficient Supernet-Based RNN-T Compression For On-device ASR Models | – | 0
Probabilistic Self-supervised Learning via Scoring Rules Minimization | – | 0
A survey on efficient vision transformers: algorithms, techniques, and performance benchmarking | – | 0
On the Query Strategies for Efficient Online Active Distillation | – | 0
Prior Knowledge Guided Network for Video Anomaly Detection | – | 0
COMEDIAN: Self-Supervised Learning and Knowledge Distillation for Action Spotting using Transformers | Code | 1
Knowledge Distillation from Non-streaming to Streaming ASR Encoder using Auxiliary Non-streaming Layer | – | 0
Adversarial Finetuning with Latent Representation Constraint to Mitigate Accuracy-Robustness Tradeoff | – | 0
MoMA: Momentum Contrastive Learning with Multi-head Attention-based Knowledge Distillation for Histopathology Image Analysis | Code | 0
Towards Long-Tailed Recognition for Graph Classification via Collaborative Experts | – | 0
Exploring Multi-Modal Contextual Knowledge for Open-Vocabulary Object Detection | – | 0
SpikeBERT: A Language Spikformer Learned from BERT with Knowledge Distillation | Code | 1
SynthDistill: Face Recognition with Knowledge Distillation from Synthetic Data | Code | 0
Bridging Cross-task Protocol Inconsistency for Distillation in Dense Object Detection | Code | 1
Distilled GPT for Source Code Summarization | Code | 0
Boosting Residual Networks with Group Knowledge | Code | 0
DM-VTON: Distilled Mobile Real-time Virtual Try-On | Code | 1
Improving Knowledge Distillation for BERT Models: Loss Functions, Mapping Methods, and Weight Tuning | – | 0
Page 68 of 170

Benchmark Results

Each table below is a separate benchmark; in the Model column, T: denotes the teacher network and S: the student.

# | Model | Metric | Claimed | Verified | Status
1 | ScaleKD (T: BEiT-L, S: ViT-B/14) | Top-1 accuracy (%) | 86.43 | – | Unverified
2 | ScaleKD (T: Swin-L, S: ViT-B/16) | Top-1 accuracy (%) | 85.53 | – | Unverified
3 | ScaleKD (T: Swin-L, S: ViT-S/16) | Top-1 accuracy (%) | 83.93 | – | Unverified
4 | ScaleKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 83.8 | – | Unverified
5 | KD++ (T: RegNetY-16GF, S: ViT-B) | Top-1 accuracy (%) | 83.6 | – | Unverified
6 | VkD (T: RegNetY-160, S: DeiT-S) | Top-1 accuracy (%) | 82.9 | – | Unverified
7 | SpectralKD (T: Swin-S, S: Swin-T) | Top-1 accuracy (%) | 82.7 | – | Unverified
8 | ScaleKD (T: Swin-L, S: ResNet-50) | Top-1 accuracy (%) | 82.55 | – | Unverified
9 | DiffKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.5 | – | Unverified
10 | DIST (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.3 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | SRD (T: resnet32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 79.86 | – | Unverified
2 | shufflenet-v2 (T: resnet32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 78.76 | – | Unverified
3 | MV-MR (T: CLIP ViT-B/16, S: ResNet-50) | Top-1 accuracy (%) | 78.6 | – | Unverified
4 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 78.28 | – | Unverified
5 | resnet8x4 (T: resnet32x4, S: resnet8x4 [modified]) | Top-1 accuracy (%) | 78.08 | – | Unverified
6 | ReviewKD++ (T: resnet32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 77.93 | – | Unverified
7 | ReviewKD++ (T: resnet32x4, S: shufflenet-v1) | Top-1 accuracy (%) | 77.68 | – | Unverified
8 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 77.5 | – | Unverified
9 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 76.68 | – | Unverified
10 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 76.31 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | LSHFM (T: ResNet-101, S: ResNet-50) | mAP | 93.17 | – | Unverified
2 | LSHFM (T: ResNet-101, S: MobileNetV2) | mAP | 90.14 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | TIE-KD (T: AdaBins, S: MobileNetV2) | RMSE | 2.43 | – | Unverified