SOTAVerified

Knowledge Distillation

Knowledge distillation is the process of transferring knowledge from a large model (the teacher) to a smaller one (the student). While large models, such as very deep neural networks or ensembles of many models, have higher knowledge capacity than small models, this capacity may not be fully utilized; a compact student trained to mimic the teacher's outputs can therefore recover much of the teacher's performance at a fraction of the inference cost.
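
The most common form of this transfer is the soft-target objective of Hinton et al. (2015): the student is trained to match the teacher's temperature-softened output distribution in addition to the ground-truth labels. Below is a minimal sketch, assuming PyTorch; the temperature T and mixing weight alpha are illustrative defaults, and many of the papers listed on this page replace or augment this loss with feature-, relation-, or rationale-level objectives.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Classic soft-target knowledge distillation loss (Hinton et al., 2015)."""
    # Student mimics the teacher's temperature-softened output distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude matches the hard-label term
    # Standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Example usage with random tensors (batch of 8, 100 classes).
student_logits = torch.randn(8, 100, requires_grad=True)
teacher_logits = torch.randn(8, 100)
labels = torch.randint(0, 100, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```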

Papers

Showing 2451–2500 of 4240 papers

Title | Status | Hype
Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher | - | 0
Promoting CNNs with Cross-Architecture Knowledge Distillation for Efficient Monocular Depth Estimation | - | 0
PromptDet: A Lightweight 3D Object Detection Framework with LiDAR Prompts | - | 0
Prompting to Distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt | - | 0
Propagate & Distill: Towards Effective Graph Learners Using Propagation-Embracing MLPs | - | 0
Prototypical Contrastive Predictive Coding | - | 0
ProxylessKD: Direct Knowledge Distillation with Inherited Classifier for Face Recognition | - | 0
Pseudo Knowledge Distillation: Towards Learning Optimal Instance-specific Label Smoothing Regularization | - | 0
Pseudo-label Correction for Instance-dependent Noise Using Teacher-student Framework | - | 0
Pseudo-Label Training and Model Inertia in Neural Machine Translation | - | 0
Pseudo Supervised Monocular Depth Estimation with Teacher-Student Network | - | 0
PTMs-TSCIL Pre-Trained Models Based Class-Incremental Learning | - | 0
PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation | - | 0
In-Distribution Consistency Regularization Improves the Generalization of Quantization-Aware Training | - | 0
Puzzle: Distillation-Based NAS for Inference-Optimized LLMs | - | 0
QABISAR: Query-Article Bipartite Interactions for Statutory Article Retrieval | - | 0
QA-HFL: Quality-Aware Hierarchical Federated Learning for Resource-Constrained Mobile Devices with Heterogeneous Image Quality | - | 0
QCRD: Quality-guided Contrastive Rationale Distillation for Large Language Models | - | 0
QKD: Quantization-aware Knowledge Distillation | - | 0
QTI Submission to DCASE 2021: residual normalization for device-imbalanced acoustic scene classification with efficient design | - | 0
QuaLA-MiniLM: a Quantized Length Adaptive MiniLM | - | 0
Quantifying Knowledge Distillation Using Partial Information Decomposition | - | 0
Quantifying the Knowledge in a DNN to Explain Knowledge Distillation for Classification | - | 0
Quantized Feature Distillation for Network Quantization | - | 0
Query-Based Knowledge Sharing for Open-Vocabulary Multi-Label Classification | - | 0
Query Distillation: BERT-based Distillation for Ensemble Ranking | - | 0
Query Optimization for Parametric Knowledge Refinement in Retrieval-Augmented Large Language Models | - | 0
Quick Dense Retrievers Consume KALE: Post Training Kullback Leibler Alignment of Embeddings for Asymmetrical dual encoders | - | 0
QUILL: Query Intent with Large Language Models using Retrieval Augmentation and Multi-stage Distillation | - | 0
QuPeD: Quantized Personalization via Distillation with Applications to Federated Learning | - | 0
Radio2Text: Streaming Speech Recognition Using mmWave Radio Signals | - | 0
RadOcc: Learning Cross-Modality Occupancy Knowledge through Rendering Assisted Distillation | - | 0
RAIL-KD: RAndom Intermediate Layer Mapping for Knowledge Distillation | - | 0
Random Conditioning for Diffusion Model Compression with Distillation | - | 0
Random Conditioning with Distillation for Data-Efficient Diffusion Model Compression | - | 0
Random Copolymer inverse design system orienting on Accurate discovering of Antimicrobial peptide-mimetic copolymers | - | 0
RangeAugment: Efficient Online Augmentation with Range Learning | - | 0
RankByGene: Gene-Guided Histopathology Representation Learning Through Cross-Modal Ranking Consistency | - | 0
RankDistil: Knowledge Distillation for Ranking | - | 0
RankDVQA-mini: Knowledge Distillation-Driven Deep Video Quality Assessment | - | 0
Ranking-aware Continual Learning for LiDAR Place Recognition | - | 0
MotherNets: Rapid Deep Ensemble Learning | - | 0
Rationalization Models for Text-to-SQL | - | 0
RAVIR: A Dataset and Methodology for the Semantic Segmentation and Quantitative Analysis of Retinal Arteries and Veins in Infrared Reflectance Imaging | - | 0
RAWtoBit: A Fully End-to-end Camera ISP Network | - | 0
RCKD: Response-Based Cross-Task Knowledge Distillation for Pathological Image Analysis | - | 0
Lightweight Embedded FPGA Deployment of Learned Image Compression with Knowledge Distillation and Hybrid Quantization | - | 0
RdimKD: Generic Distillation Paradigm by Dimensionality Reduction | - | 0
Re2G: Retrieve, Rerank, Generate | - | 0
Real-time Monocular Depth Estimation with Sparse Supervision on Mobile | - | 0
Page 50 of 85

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ScaleKD (T: BEiT-L, S: ViT-B/14) | Top-1 accuracy (%) | 86.43 | - | Unverified
2 | ScaleKD (T: Swin-L, S: ViT-B/16) | Top-1 accuracy (%) | 85.53 | - | Unverified
3 | ScaleKD (T: Swin-L, S: ViT-S/16) | Top-1 accuracy (%) | 83.93 | - | Unverified
4 | ScaleKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 83.8 | - | Unverified
5 | KD++ (T: regnety-16GF, S: ViT-B) | Top-1 accuracy (%) | 83.6 | - | Unverified
6 | VkD (T: RegNety 160, S: DeiT-S) | Top-1 accuracy (%) | 82.9 | - | Unverified
7 | SpectralKD (T: Swin-S, S: Swin-T) | Top-1 accuracy (%) | 82.7 | - | Unverified
8 | ScaleKD (T: Swin-L, S: ResNet-50) | Top-1 accuracy (%) | 82.55 | - | Unverified
9 | DiffKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.5 | - | Unverified
10 | DIST (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.3 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | SRD (T: resnet-32x4, S: shufflenet-v2) | Top-1 Accuracy (%) | 79.86 | - | Unverified
2 | shufflenet-v2 (T: resnet-32x4, S: shufflenet-v2) | Top-1 Accuracy (%) | 78.76 | - | Unverified
3 | MV-MR (T: CLIP/ViT-B-16, S: resnet50) | Top-1 Accuracy (%) | 78.6 | - | Unverified
4 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 78.28 | - | Unverified
5 | resnet8x4 (T: resnet32x4, S: resnet8x4 [modified]) | Top-1 Accuracy (%) | 78.08 | - | Unverified
6 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v2) | Top-1 Accuracy (%) | 77.93 | - | Unverified
7 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v1) | Top-1 Accuracy (%) | 77.68 | - | Unverified
8 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 77.5 | - | Unverified
9 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 76.68 | - | Unverified
10 | resnet8x4 (T: resnet32x4, S: resnet8x4) | Top-1 Accuracy (%) | 76.31 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | LSHFM (T: ResNet101, S: ResNet50) | mAP | 93.17 | - | Unverified
2 | LSHFM (T: ResNet101, S: MobileNetV2) | mAP | 90.14 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | TIE-KD (T: Adabins, S: MobileNetV2) | RMSE | 2.43 | - | Unverified