
Knowledge Distillation

Knowledge distillation is the process of transferring knowledge from a large model to a smaller one. While large models (such as very deep neural networks or ensembles of many models) have greater knowledge capacity than small models, that capacity is often not fully utilized, so a compact "student" model can frequently be trained to reproduce much of a larger "teacher" model's behavior, typically by mimicking the teacher's softened output distribution rather than only the hard labels.
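
To make this concrete, below is a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015), assuming PyTorch; the function name `distillation_loss` and the hyperparameter values are illustrative choices, not taken from any paper listed below.

```python
# Minimal soft-target knowledge distillation sketch (assumes PyTorch).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    """Hinton-style KD: weighted mix of soft-target KL and hard-label CE."""
    # Temperature-soften both output distributions so the teacher's
    # "dark knowledge" (relative probabilities of wrong classes) survives.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # The T^2 factor keeps the soft term's gradient magnitude
    # comparable to the hard cross-entropy term.
    kd_term = F.kl_div(soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term

# Typical training step: freeze the teacher, train only the student.
# with torch.no_grad():
#     teacher_logits = teacher(images)
# loss = distillation_loss(student(images), teacher_logits, labels)
```

Many of the papers listed below replace or augment this logit-matching objective with other transfer signals, such as intermediate feature maps or attention, as several of the titles suggest.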

Papers

Showing papers 351–400 of 4240 (page 8 of 85)

Title | Status | Hype
Distilling Knowledge for Designing Computational Imaging Systems | Code | 0
A Contrastive Teacher-Student Framework for Novelty Detection under Style Shifts | — | 0
Efficient Knowledge Distillation of SAM for Medical Image Segmentation | — | 0
Heterogeneity-aware Personalized Federated Learning via Adaptive Dual-Agent Reinforcement Learning | — | 0
TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models | — | 0
FedEFM: Federated Endovascular Foundation Model with Unseen Data | — | 0
Target-driven Self-Distillation for Partial Observed Trajectories Forecasting | — | 0
Efficient Logit-based Knowledge Distillation of Deep Spiking Neural Networks for Full-Range Timestep Deployment | Code | 0
Return of the Encoder: Maximizing Parameter Efficiency for SLMs | Code | 1
PISCO: Pretty Simple Compression for Retrieval-Augmented Generation | — | 0
Scaling Large Vision-Language Models for Enhanced Multimodal Comprehension In Biomedical Image Analysis | — | 0
MimicGait: A Model Agnostic approach for Occluded Gait Recognition using Correlational Knowledge Distillation | Code | 0
On Accelerating Edge AI: Optimizing Resource-Constrained Environments | — | 0
Pre-trained Model Guided Mixture Knowledge Distillation for Adversarial Federated Learning | — | 0
Graph-Based Cross-Domain Knowledge Distillation for Cross-Dataset Text-to-Image Person Retrieval | — | 0
Remining Hard Negatives for Generative Pseudo Labeled Domain Adaptation | — | 0
Multimodal Prescriptive Deep Learning | — | 0
Multi-aspect Knowledge Distillation with Large Language Model | Code | 0
Unlearning Clients, Features and Samples in Vertical Federated Learning | — | 0
LiT: Delving into a Simplified Linear Diffusion Transformer for Image Generation | — | 0
EchoLM: Accelerating LLM Serving with Real-time Knowledge Distillation | — | 0
Toward Model-centric Heterogeneous Federated Graph Learning: A Knowledge-driven Approach | — | 0
Extracting General-use Transformers for Low-resource Languages via Knowledge Distillation | — | 0
Efficient Lung Ultrasound Severity Scoring Using Dedicated Feature Extractor | Code | 0
Learning to reconstruct signals with inexact sensing operator via knowledge distillation | — | 0
DNA 1.0 Technical Report | — | 0
Enhancing Generalization in Chain of Thought Reasoning for Smaller Models | — | 0
Knowledge Distillation for Image Restoration: Simultaneous Learning from Degraded and Clean Images | — | 0
Soft Knowledge Distillation with Multi-Dimensional Cross-Net Attention for Image Restoration Models Compression | — | 0
Class Incremental Fault Diagnosis under Limited Fault Data via Supervised Contrastive Knowledge Distillation | Code | 0
Induced Model Matching: Restricted Models Help Train Full-Featured Models | Code | 0
Efficient Traffic Prediction Through Spatio-Temporal Distillation | Code | 1
Towards Fast, Specialized Machine Learning Force Fields: Distilling Foundation Models via Energy Hessians | Code | 1
Feature-based One-For-All: A Universal Framework for Heterogeneous Knowledge Distillation | — | 0
VECT-GAN: A variationally encoded generative model for overcoming data scarcity in pharmaceutical science | Code | 0
Balance Divergence for Knowledge Distillation | — | 0
Self-Attentive Spatio-Temporal Calibration for Precise Intermediate Layer Matching in ANN-to-SNN Distillation | Code | 0
Knowledge Distillation and Enhanced Subdomain Adaptation Using Graph Convolutional Network for Resource-Constrained Bearing Fault Diagnosis | — | 0
Research on the Online Update Method for Retrieval-Augmented Generation (RAG) Model with Incremental Learning | — | 0
Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective | — | 0
Dual Scale-aware Adaptive Masked Knowledge Distillation for Object Detection | — | 0
Application of Vision-Language Model to Pedestrians Behavior and Scene Understanding in Autonomous Driving | — | 0
Overcoming Language Priors for Visual Question Answering Based on Knowledge Distillation | — | 0
From My View to Yours: Ego-Augmented Learning in Large Vision Language Models for Understanding Exocentric Daily Living Activities | Code | 1
LLMQuoter: Enhancing RAG Capabilities Through Efficient Quote Extraction From Large Contexts | Code | 0
Generative Dataset Distillation Based on Self-knowledge Distillation | — | 0
Enhancing Scene Classification in Cloudy Image Scenarios: A Collaborative Transfer Method with Information Regulation Mechanism using Optical Cloud-Covered and SAR Remote Sensing Images | Code | 0
Federated Fine-Tuning of LLMs: Framework Comparison and Research Directions | — | 0
FedKD-hybrid: Federated Hybrid Knowledge Distillation for Lithography Hotspot Detection | Code | 0
ConcealGS: Concealing Invisible Copyright Information in 3D Gaussian Splatting | Code | 1

Benchmark Results

In the model columns below, "T:" denotes the teacher model and "S:" the student.

# | Model | Metric | Claimed | Verified | Status
1 | ScaleKD (T: BEiT-L, S: ViT-B/14) | Top-1 accuracy (%) | 86.43 | — | Unverified
2 | ScaleKD (T: Swin-L, S: ViT-B/16) | Top-1 accuracy (%) | 85.53 | — | Unverified
3 | ScaleKD (T: Swin-L, S: ViT-S/16) | Top-1 accuracy (%) | 83.93 | — | Unverified
4 | ScaleKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 83.8 | — | Unverified
5 | KD++ (T: RegNetY-16GF, S: ViT-B) | Top-1 accuracy (%) | 83.6 | — | Unverified
6 | VkD (T: RegNetY-160, S: DeiT-S) | Top-1 accuracy (%) | 82.9 | — | Unverified
7 | SpectralKD (T: Swin-S, S: Swin-T) | Top-1 accuracy (%) | 82.7 | — | Unverified
8 | ScaleKD (T: Swin-L, S: ResNet-50) | Top-1 accuracy (%) | 82.55 | — | Unverified
9 | DiffKD (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.5 | — | Unverified
10 | DIST (T: Swin-L, S: Swin-T) | Top-1 accuracy (%) | 82.3 | — | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | SRD (T: ResNet-32x4, S: ShuffleNet-v2) | Top-1 accuracy (%) | 79.86 | — | Unverified
2 | ShuffleNet-v2 (T: ResNet-32x4, S: ShuffleNet-v2) | Top-1 accuracy (%) | 78.76 | — | Unverified
3 | MV-MR (T: CLIP/ViT-B-16, S: ResNet-50) | Top-1 accuracy (%) | 78.6 | — | Unverified
4 | ResNet-8x4 (T: ResNet-32x4, S: ResNet-8x4) | Top-1 accuracy (%) | 78.28 | — | Unverified
5 | ResNet-8x4 (T: ResNet-32x4, S: ResNet-8x4 [modified]) | Top-1 accuracy (%) | 78.08 | — | Unverified
6 | ReviewKD++ (T: ResNet-32x4, S: ShuffleNet-v2) | Top-1 accuracy (%) | 77.93 | — | Unverified
7 | ReviewKD++ (T: ResNet-32x4, S: ShuffleNet-v1) | Top-1 accuracy (%) | 77.68 | — | Unverified
8 | ResNet-8x4 (T: ResNet-32x4, S: ResNet-8x4) | Top-1 accuracy (%) | 77.5 | — | Unverified
9 | ResNet-8x4 (T: ResNet-32x4, S: ResNet-8x4) | Top-1 accuracy (%) | 76.68 | — | Unverified
10 | ResNet-8x4 (T: ResNet-32x4, S: ResNet-8x4) | Top-1 accuracy (%) | 76.31 | — | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | LSHFM (T: ResNet-101, S: ResNet-50) | mAP | 93.17 | — | Unverified
2 | LSHFM (T: ResNet-101, S: MobileNetV2) | mAP | 90.14 | — | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | TIE-KD (T: AdaBins, S: MobileNetV2) | RMSE | 2.43 | — | Unverified