SOTAVerified

Data-free Knowledge Distillation

Papers

Showing 1–25 of 75 papers

Title | Status | Hype
Prompt2Model: Generating Deployable Models from Natural Language Instructions | Code | 4
Data-Free Knowledge Distillation for Deep Neural Networks | Code | 2
Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning | Code | 1
Up to 100× Faster Data-free Knowledge Distillation | Code | 1
Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation? | Code | 1
Contrastive Model Inversion for Data-Free Knowledge Distillation | Code | 1
Relation-Guided Adversarial Learning for Data-free Knowledge Transfer | Code | 1
DKDM: Data-Free Knowledge Distillation for Diffusion Models with Any Architecture | Code | 1
Small Scale Data-Free Knowledge Distillation | Code | 1
NAYER: Noisy Layer Data Generation for Efficient and Effective Data-free Knowledge Distillation | Code | 1
MAZE: Data-Free Model Stealing Attack Using Zeroth-Order Gradient Estimation | Code | 1
Diffusion-Driven Data Replay: A Novel Approach to Combat Forgetting in Federated Class Continual Learning | Code | 1
Data-Free Knowledge Distillation for Heterogeneous Federated Learning | Code | 1
ProGen: Progressive Zero-shot Dataset Generation via In-context Feedback | Code | 1
Revisiting Data-Free Knowledge Distillation with Poisoned Teachers | Code | 1
One-shot Federated Learning via Synthetic Distiller-Distillate Communication | Code | 1
ZeroGen: Efficient Zero-shot Learning via Dataset Generation | Code | 1
Data-Free Knowledge Distillation via Feature Exchange and Activation Region Constraint | Code | 1
EchoDFKD: Data-Free Knowledge Distillation for Cardiac Ultrasound Segmentation using Synthetic Data | Code | 1
Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay | Code | 1
Training Generative Adversarial Networks in One Stage | Code | 1
Data-free Knowledge Distillation for Segmentation using Data-Enriching GAN | Code | 0
Knowledge Extraction with No Observable Data | Code | 0
Large-Scale Data-Free Knowledge Distillation for ImageNet via Multi-Resolution Data Generation | Code | 0
AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation | Code | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | GOLD (T5-base) | Accuracy | 91.7 | - | Unverified
2 | ZeroGen (T5-base) | Accuracy | 88.5 | - | Unverified
3 | ProGen (T5-base) | Accuracy | 85.9 | - | Unverified
4 | Prompt2Model (T5-base) | Accuracy | 62.2 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | GOLD (T5-base) | Exact Match | 75.2 | - | Unverified
2 | Prompt2Model (T5-base) | Exact Match | 74.4 | - | Unverified
3 | ZeroGen (T5-base) | Exact Match | 69.4 | - | Unverified
4 | ProGen (T5-base) | Exact Match | 68.1 | - | Unverified