SOTAVerified

Data-free Knowledge Distillation

Papers

Showing 61–70 of 75 papers

| Title | Status | Hype |
|---|---|---|
| Data-Free Knowledge Transfer: A Survey | | 0 |
| Data-Free Knowledge Distillation with Soft Targeted Transfer Set Synthesis | | 0 |
| Data-Free Knowledge Distillation Using Adversarially Perturbed OpenGL Shader Images | | 0 |
| Towards Lifelong Few-Shot Customization of Text-to-Image Diffusion | | 0 |
| Dual Discriminator Adversarial Distillation for Data-free Model Compression | | 0 |
| Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation | | 0 |
| Explicit and Implicit Knowledge Distillation via Unlabeled Data | | 0 |
| Feature-Rich Audio Model Inversion for Data-Free Knowledge Distillation Towards General Sound Classification | | 0 |
| FedD2S: Personalized Data-Free Federated Knowledge Distillation | | 0 |
| FedDTG: Federated Data-Free Knowledge Distillation via Three-Player Generative Adversarial Networks | | 0 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | GOLD (T5-base) | Accuracy | 91.7 | — | Unverified |
| 2 | ZeroGen (T5-base) | Accuracy | 88.5 | — | Unverified |
| 3 | ProGen (T5-base) | Accuracy | 85.9 | — | Unverified |
| 4 | Prompt2Model (T5-base) | Accuracy | 62.2 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | GOLD (T5-base) | Exact Match | 75.2 | — | Unverified |
| 2 | Prompt2Model (T5-base) | Exact Match | 74.4 | — | Unverified |
| 3 | ZeroGen (T5-base) | Exact Match | 69.4 | — | Unverified |
| 4 | ProGen (T5-base) | Exact Match | 68.1 | — | Unverified |