SOTAVerified

Data-free Knowledge Distillation

Papers

Showing 51–60 of 75 papers

| Title | Status | Hype |
| --- | --- | --- |
| Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation | — | 0 |
| NIFF: Alleviating Forgetting in Generalized Few-Shot Object Detection via Neural Instance Feature Forging | — | 0 |
| Prompting to Distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt | — | 0 |
| Sampling to Distill: Knowledge Transfer from Open-World Data | — | 0 |
| Towards Lifelong Few-Shot Customization of Text-to-Image Diffusion | — | 0 |
| Customizing Synthetic Data for Data-Free Student Learning | Code | 0 |
| Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy | Code | 0 |
| Handling Data Heterogeneity in Federated Learning via Knowledge Distillation and Fusion | Code | 0 |
| Does Training with Synthetic Data Truly Protect Privacy? | Code | 0 |
| Cross-feature Contrastive Loss for Decentralized Deep Learning on Heterogeneous Data | Code | 0 |
Page 6 of 8

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | GOLD (T5-base) | Accuracy | 91.7 | — | Unverified |
| 2 | ZeroGen (T5-base) | Accuracy | 88.5 | — | Unverified |
| 3 | ProGen (T5-base) | Accuracy | 85.9 | — | Unverified |
| 4 | Prompt2Model (T5-base) | Accuracy | 62.2 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | GOLD (T5-base) | Exact Match | 75.2 | — | Unverified |
| 2 | Prompt2Model (T5-base) | Exact Match | 74.4 | — | Unverified |
| 3 | ZeroGen (T5-base) | Exact Match | 69.4 | — | Unverified |
| 4 | ProGen (T5-base) | Exact Match | 68.1 | — | Unverified |