
Knowledge Distillation

Knowledge distillation is the process of transferring knowledge from a large model to a smaller one. While large models (such as very deep neural networks or ensembles of many models) have higher knowledge capacity than small models, this capacity might not be fully utilized; a compact "student" model trained to mimic the outputs of the large "teacher" can therefore often recover most of the teacher's accuracy at a fraction of the inference cost.
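As a rough illustration, here is a minimal sketch (in PyTorch) of the classic response-based distillation loss of Hinton et al. (2015): the student matches the teacher's temperature-softened class probabilities while also fitting the ground-truth labels. The `distillation_loss` helper and the `temperature`/`alpha` values are illustrative assumptions, not the setup of any specific paper listed below.

```python
# Minimal sketch of response-based knowledge distillation.
# All names and hyperparameters here are illustrative assumptions.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    """Blend a soft-target KL term with hard-label cross-entropy."""
    # Soften both distributions with the temperature, then match them.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    kd = F.kl_div(soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Typical usage: the teacher runs frozen, the student is trained.
# with torch.no_grad():
#     teacher_logits = teacher(x)
# loss = distillation_loss(student(x), teacher_logits, y)
```

In this formulation the temperature controls how much of the teacher's "dark knowledge" (the relative probabilities of incorrect classes) is exposed to the student, and alpha trades that signal off against the hard labels.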

Papers

Showing 101–125 of 4240 papers

| Title | Status | Hype |
|---|---|---|
| Why Knowledge Distillation Works in Generative Models: A Minimal Working Explanation | | 0 |
| Uniformity First: Uniformity-aware Test-time Adaptation of Vision-language Models against Image Corruption | Code | 0 |
| LAMeTA: Intent-Aware Agentic Network Optimization via a Large AI Model-Empowered Two-Stage Approach | | 0 |
| Always Clear Depth: Robust Monocular Depth Estimation under Adverse Weather | Code | 1 |
| SSR: Enhancing Depth Perception in Vision-Language Models via Rationale-Guided Spatial Reasoning | | 0 |
| On Membership Inference Attacks in Knowledge Distillation | Code | 0 |
| Denoising Mutual Knowledge Distillation in Bi-Directional Multiple Instance Learning | | 0 |
| FiGKD: Fine-Grained Knowledge Distillation via High-Frequency Detail Transfer | | 0 |
| Semantically-Aware Game Image Quality Assessment | | 0 |
| Bidirectional Distillation: A Mixed-Play Framework for Multi-Agent Generalizable Behaviors | | 0 |
| Distilled Circuits: A Mechanistic Study of Internal Restructuring in Knowledge Distillation | Code | 0 |
| Advancing Multiple Instance Learning with Continual Learning for Whole Slide Imaging | | 0 |
| DCSNet: A Lightweight Knowledge Distillation-Based Model with Explainable AI for Lung Cancer Diagnosis from Histopathological Images | | 0 |
| Low-Complexity Inference in Continual Learning via Compressed Knowledge Transfer | | 0 |
| MoKD: Multi-Task Optimization for Knowledge Distillation | | 0 |
| Fusing Bidirectional Chains of Thought and Reward Mechanisms: A Method for Enhancing Question-Answering Capabilities of Large Language Models for Chinese Intangible Cultural Heritage | | 0 |
| Foundation Models Knowledge Distillation For Battery Capacity Degradation Forecast | Code | 1 |
| An Extra RMSNorm is All You Need for Fine Tuning to 1.58 Bits | | 0 |
| KDH-MLTC: Knowledge Distillation for Healthcare Multi-Label Text Classification | | 0 |
| Channel Fingerprint Construction for Massive MIMO: A Deep Conditional Generative Approach | | 0 |
| Topology-Guided Knowledge Distillation for Efficient Point Cloud Processing | Code | 0 |
| Simple Semi-supervised Knowledge Distillation from Vision-Language Models via Dual-Head Optimization | Code | 0 |
| Ranking-aware Continual Learning for LiDAR Place Recognition | | 0 |
| Structural Entropy Guided Agent for Detecting and Repairing Knowledge Deficiencies in LLMs | Code | 2 |
| Knowledge Distillation for Enhancing Walmart E-commerce Search Relevance Using Large Language Models | | 0 |
Page 5 of 170

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | ScaleKD (T: BEiT-L, S: ViT-B/14) | Top-1 accuracy (%) | 86.43 | | Unverified |
| 2 | ScaleKD (T: Swin-L, S: ViT-B/16) | Top-1 accuracy (%) | 85.53 | | Unverified |
| 3 | ScaleKD (T: Swin-L, S: ViT-S/16) | Top-1 accuracy (%) | 83.93 | | Unverified |
| 4 | ScaleKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 83.8 | | Unverified |
| 5 | KD++ (T: regnety-16GF, S: ViT-B) | Top-1 accuracy (%) | 83.6 | | Unverified |
| 6 | VkD (T: RegNety 160, S: DeiT-S) | Top-1 accuracy (%) | 82.9 | | Unverified |
| 7 | SpectralKD (T: Swin-S, S: Swin-T) | Top-1 accuracy (%) | 82.7 | | Unverified |
| 8 | ScaleKD (T: Swin-L, S: ResNet-50) | Top-1 accuracy (%) | 82.55 | | Unverified |
| 9 | DiffKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.5 | | Unverified |
| 10 | DIST (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.3 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SRD (T: resnet-32x4, S: shufflenet-v2) | Top-1 Accuracy (%) | 79.86 | | Unverified |
| 2 | shufflenet-v2 (T: resnet-32x4, S: shufflenet-v2) | Top-1 Accuracy (%) | 78.76 | | Unverified |
| 3 | MV-MR (T: CLIP/ViT-B-16, S: resnet50) | Top-1 Accuracy (%) | 78.6 | | Unverified |
| 4 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 78.28 | | Unverified |
| 5 | resnet8x4 (T: resnet32x4, S: resnet8x4 [modified]) | Top-1 Accuracy (%) | 78.08 | | Unverified |
| 6 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v2) | Top-1 Accuracy (%) | 77.93 | | Unverified |
| 7 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v1) | Top-1 Accuracy (%) | 77.68 | | Unverified |
| 8 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 77.5 | | Unverified |
| 9 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 76.68 | | Unverified |
| 10 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 76.31 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | LSHFM (T: ResNet101, S: ResNet50) | mAP | 93.17 | | Unverified |
| 2 | LSHFM (T: ResNet101, S: MobileNetV2) | mAP | 90.14 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | TIE-KD (T: Adabins, S: MobileNetV2) | RMSE | 2.43 | | Unverified |