SOTAVerified

Data-free Knowledge Distillation

Papers

Showing 61–70 of 75 papers

| Title | Status | Hype |
| --- | --- | --- |
| Towards Data-Free Domain Generalization | Code | 0 |
| Data-Free Knowledge Distillation for Image Super-Resolution | Code | 0 |
| Data-Free Knowledge Distillation for Heterogeneous Federated Learning | Code | 1 |
| Contrastive Model Inversion for Data-Free Knowledge Distillation | Code | 1 |
| Dual Discriminator Adversarial Distillation for Data-free Model Compression | | 0 |
| Data-Free Knowledge Distillation with Soft Targeted Transfer Set Synthesis | | 0 |
| Training Generative Adversarial Networks in One Stage | Code | 1 |
| Generative Adversarial Simulator | | 0 |
| Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation | | 0 |
| Robustness and Diversity Seeking Data-Free Knowledge Distillation | Code | 0 |
Page 7 of 8

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | GOLD (T5-base) | Accuracy | 91.7 | | Unverified |
| 2 | ZeroGen (T5-base) | Accuracy | 88.5 | | Unverified |
| 3 | ProGen (T5-base) | Accuracy | 85.9 | | Unverified |
| 4 | Prompt2Model (T5-base) | Accuracy | 62.2 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | GOLD (T5-base) | Exact Match | 75.2 | | Unverified |
| 2 | Prompt2Model (T5-base) | Exact Match | 74.4 | | Unverified |
| 3 | ZeroGen (T5-base) | Exact Match | 69.4 | | Unverified |
| 4 | ProGen (T5-base) | Exact Match | 68.1 | | Unverified |