
Knowledge Distillation

Knowledge distillation is the process of transferring knowledge from a large model (the teacher) to a smaller one (the student). While large models (such as very deep neural networks or ensembles of many models) have greater knowledge capacity than small models, that capacity may not be fully utilized, so a well-trained student can often recover most of the teacher's accuracy at a fraction of the inference cost.
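As a concrete illustration of the idea, the sketch below implements the classic soft-target distillation loss of Hinton et al. (2015) in PyTorch. The function name and the particular values of the temperature `T` and mixing weight `alpha` are illustrative assumptions, not taken from any paper listed on this page.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      T: float = 4.0, alpha: float = 0.9):
    """Blend a soft-target distillation term with the usual hard-label loss.

    T     -- softmax temperature; higher T softens the teacher's distribution.
    alpha -- weight on the distillation term versus the hard-label term.
    """
    # Soft targets: KL divergence between temperature-scaled distributions.
    # F.kl_div expects log-probabilities for the input and probabilities
    # for the target; the T**2 factor keeps gradient magnitudes comparable
    # across different temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Minimal usage with random tensors (batch of 8, 100 classes):
s = torch.randn(8, 100)
t = torch.randn(8, 100)
y = torch.randint(0, 100, (8,))
loss = distillation_loss(s, t, y)
```

Raising the temperature softens the teacher's distribution so the relative probabilities of incorrect classes (the "dark knowledge") contribute more to the student's gradient, which is the core mechanism behind most of the methods listed below.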

Papers

Showing 2701–2725 of 4240 papers

Title | Status | Hype
A "Network Pruning Network" Approach to Deep Model Compression | | 0
A New Method to Capturing Compositional Knowledge in Linguistic Space | | 0
An Extra RMSNorm is All You Need for Fine Tuning to 1.58 Bits | | 0
An Interpretable Neuron Embedding for Static Knowledge Distillation | | 0
A Novel Algorithm for Personalized Federated Learning: Knowledge Distillation with Weighted Combination Loss | | 0
A Novel Approach To Implementing Knowledge Distillation In Tsetlin Machines | | 0
A Novel Architecture Slimming Method for Network Pruning and Knowledge Distillation | | 0
A novel channel pruning method for deep neural network compression | | 0
A Novel Garment Transfer Method Supervised by Distilled Knowledge of Virtual Try-on Model | | 0
A Novel Lightweight Transformer with Edge-Aware Fusion for Remote Sensing Image Captioning | | 0
A Novel Local-Global Feature Fusion Framework for Body-weight Exercise Recognition with Pressure Mapping Sensors | | 0
A Novel Self-Knowledge Distillation Approach with Siamese Representation Learning for Action Recognition | | 0
A Novel Spike Transformer Network for Depth Estimation from Event Cameras via Cross-modality Knowledge Distillation | | 0
An Overview of Neural Network Compression | | 0
AntMan: Sparse Low-Rank Compression to Accelerate RNN inference | | 0
An Unsupervised Multiple-Task and Multiple-Teacher Model for Cross-lingual Named Entity Recognition | | 0
APALU: A Trainable, Adaptive Activation Function for Deep Learning Networks | | 0
A Peek Into the Reasoning of Neural Networks: Interpreting with Structural Visual Concepts | | 0
A Plasticity-Aware Method for Continual Self-Supervised Learning in Remote Sensing | | 0
Application of Knowledge Distillation to Multi-task Speech Representation Learning | | 0
Application of Vision-Language Model to Pedestrians Behavior and Scene Understanding in Autonomous Driving | | 0
Applications of Knowledge Distillation in Remote Sensing: A Survey | | 0
Applied Federated Model Personalisation in the Industrial Domain: A Comparative Study | | 0
Apprenticeship-Inspired Elegance: Synergistic Knowledge Distillation Empowers Spiking Neural Networks for Efficient Single-Eye Emotion Recognition | | 0
Apprentice: Using Knowledge Distillation Techniques To Improve Low-Precision Network Accuracy | | 0
Page 109 of 170

Benchmark Results

In each entry, T: denotes the teacher model and S: the student model.

# | Model | Metric | Claimed | Verified | Status
1 | ScaleKD (T: BEiT-L, S: ViT-B/14) | Top-1 accuracy (%) | 86.43 | | Unverified
2 | ScaleKD (T: Swin-L, S: ViT-B/16) | Top-1 accuracy (%) | 85.53 | | Unverified
3 | ScaleKD (T: Swin-L, S: ViT-S/16) | Top-1 accuracy (%) | 83.93 | | Unverified
4 | ScaleKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 83.8 | | Unverified
5 | KD++ (T: RegNetY-16GF, S: ViT-B) | Top-1 accuracy (%) | 83.6 | | Unverified
6 | VkD (T: RegNetY-160, S: DeiT-S) | Top-1 accuracy (%) | 82.9 | | Unverified
7 | SpectralKD (T: Swin-S, S: Swin-T) | Top-1 accuracy (%) | 82.7 | | Unverified
8 | ScaleKD (T: Swin-L, S: ResNet-50) | Top-1 accuracy (%) | 82.55 | | Unverified
9 | DiffKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.5 | | Unverified
10 | DIST (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.3 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | SRD (T: resnet32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 79.86 | | Unverified
2 | shufflenet-v2 (T: resnet32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 78.76 | | Unverified
3 | MV-MR (T: CLIP/ViT-B-16, S: resnet50) | Top-1 accuracy (%) | 78.6 | | Unverified
4 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 78.28 | | Unverified
5 | resnet8x4 (T: resnet32x4, S: resnet8x4 [modified]) | Top-1 accuracy (%) | 78.08 | | Unverified
6 | ReviewKD++ (T: resnet32x4, S: shufflenet-v2) | Top-1 accuracy (%) | 77.93 | | Unverified
7 | ReviewKD++ (T: resnet32x4, S: shufflenet-v1) | Top-1 accuracy (%) | 77.68 | | Unverified
8 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 77.5 | | Unverified
9 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 76.68 | | Unverified
10 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 accuracy (%) | 76.31 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | LSHFM (T: ResNet101, S: ResNet50) | mAP | 93.17 | | Unverified
2 | LSHFM (T: ResNet101, S: MobileNetV2) | mAP | 90.14 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | TIE-KD (T: Adabins, S: MobileNetV2) | RMSE | 2.43 | | Unverified