SOTAVerified

Data-free Knowledge Distillation

Papers

Showing 26–50 of 75 papers

| Title | Status | Hype |
|---|---|---|
| FedD2S: Personalized Data-Free Federated Knowledge Distillation | — | 0 |
| Data-Free Distillation of Language Model by Text-to-Text Transfer | — | 0 |
| Cross-feature Contrastive Loss for Decentralized Deep Learning on Heterogeneous Data | Code | 0 |
| Data-Free Knowledge Distillation Using Adversarially Perturbed OpenGL Shader Images | — | 0 |
| NAYER: Noisy Layer Data Generation for Efficient and Effective Data-free Knowledge Distillation | Code | 1 |
| DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning | — | 0 |
| DAD++: Improved Data-free Test Time Adversarial Defense | Code | 0 |
| Prompt2Model: Generating Deployable Models from Natural Language Instructions | Code | 4 |
| Sampling to Distill: Knowledge Transfer from Open-World Data | — | 0 |
| Mitigating Cross-client GANs-based Attack in Federated Learning | — | 0 |
| Distribution Shift Matters for Knowledge Distillation with Webly Collected Images | — | 0 |
| Customizing Synthetic Data for Data-Free Student Learning | Code | 0 |
| Revisiting Data-Free Knowledge Distillation with Poisoned Teachers | Code | 1 |
| Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation? | Code | 1 |
| Feature-Rich Audio Model Inversion for Data-Free Knowledge Distillation Towards General Sound Classification | — | 0 |
| NIFF: Alleviating Forgetting in Generalized Few-Shot Object Detection via Neural Instance Feature Forging | — | 0 |
| Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation | — | 0 |
| Explicit and Implicit Knowledge Distillation via Unlabeled Data | — | 0 |
| Synthetic data generation method for data-free knowledge distillation in regression neural networks | Code | 0 |
| Data-Free Knowledge Distillation via Feature Exchange and Activation Region Constraint | Code | 1 |
| Completely Heterogeneous Federated Learning | — | 0 |
| ProGen: Progressive Zero-shot Dataset Generation via In-context Feedback | Code | 1 |
| Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation | — | 0 |
| Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy | Code | 0 |
| Dense Depth Distillation with Out-of-Distribution Simulated Images | — | 0 |
Page 2 of 3

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | GOLD (T5-base) | Accuracy | 91.7 | — | Unverified |
| 2 | ZeroGen (T5-base) | Accuracy | 88.5 | — | Unverified |
| 3 | ProGen (T5-base) | Accuracy | 85.9 | — | Unverified |
| 4 | Prompt2Model (T5-base) | Accuracy | 62.2 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | GOLD (T5-base) | Exact Match | 75.2 | — | Unverified |
| 2 | Prompt2Model (T5-base) | Exact Match | 74.4 | — | Unverified |
| 3 | ZeroGen (T5-base) | Exact Match | 69.4 | — | Unverified |
| 4 | ProGen (T5-base) | Exact Match | 68.1 | — | Unverified |