SOTAVerified

Data-free Knowledge Distillation

Papers

Showing 51–60 of 75 papers

| Title | Status | Hype |
| --- | --- | --- |
| Handling Data Heterogeneity in Federated Learning via Knowledge Distillation and Fusion | Code | 0 |
| CDFKD-MFS: Collaborative Data-free Knowledge Distillation via Multi-level Feature Sharing | Code | 0 |
| Prompting to Distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt | | 0 |
| Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning | Code | 1 |
| ZeroGen: Efficient Zero-shot Learning via Dataset Generation | Code | 1 |
| FedDTG: Federated Data-Free Knowledge Distillation via Three-Player Generative Adversarial Networks | | 0 |
| Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay | Code | 1 |
| Conditional Generative Data-free Knowledge Distillation | | 0 |
| Data-Free Knowledge Transfer: A Survey | | 0 |
| Up to 100× Faster Data-free Knowledge Distillation | Code | 1 |
Page 6 of 8

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | GOLD (T5-base) | Accuracy | 91.7 | | Unverified |
| 2 | ZeroGen (T5-base) | Accuracy | 88.5 | | Unverified |
| 3 | ProGen (T5-base) | Accuracy | 85.9 | | Unverified |
| 4 | Prompt2Model (T5-base) | Accuracy | 62.2 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | GOLD (T5-base) | Exact Match | 75.2 | | Unverified |
| 2 | Prompt2Model (T5-base) | Exact Match | 74.4 | | Unverified |
| 3 | ZeroGen (T5-base) | Exact Match | 69.4 | | Unverified |
| 4 | ProGen (T5-base) | Exact Match | 68.1 | | Unverified |