SOTAVerified

Data-free Knowledge Distillation

Papers

Showing 1–50 of 75 papers

| Title | Status | Hype |
|---|---|---|
| Prompt2Model: Generating Deployable Models from Natural Language Instructions | Code | 4 |
| Data-Free Knowledge Distillation for Deep Neural Networks | Code | 2 |
| Data-Free Knowledge Distillation for Heterogeneous Federated Learning | Code | 1 |
| DKDM: Data-Free Knowledge Distillation for Diffusion Models with Any Architecture | Code | 1 |
| EchoDFKD: Data-Free Knowledge Distillation for Cardiac Ultrasound Segmentation using Synthetic Data | Code | 1 |
| Revisiting Data-Free Knowledge Distillation with Poisoned Teachers | Code | 1 |
| Relation-Guided Adversarial Learning for Data-free Knowledge Transfer | Code | 1 |
| Contrastive Model Inversion for Data-Free Knowledge Distillation | Code | 1 |
| Data-Free Knowledge Distillation via Feature Exchange and Activation Region Constraint | Code | 1 |
| NAYER: Noisy Layer Data Generation for Efficient and Effective Data-free Knowledge Distillation | Code | 1 |
| Training Generative Adversarial Networks in One Stage | Code | 1 |
| Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning | Code | 1 |
| Small Scale Data-Free Knowledge Distillation | Code | 1 |
| ZeroGen: Efficient Zero-shot Learning via Dataset Generation | Code | 1 |
| ProGen: Progressive Zero-shot Dataset Generation via In-context Feedback | Code | 1 |
| Up to 100× Faster Data-free Knowledge Distillation | Code | 1 |
| One-shot Federated Learning via Synthetic Distiller-Distillate Communication | Code | 1 |
| MAZE: Data-Free Model Stealing Attack Using Zeroth-Order Gradient Estimation | Code | 1 |
| Diffusion-Driven Data Replay: A Novel Approach to Combat Forgetting in Federated Class Continual Learning | Code | 1 |
| Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation? | Code | 1 |
| Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay | Code | 1 |
| Generative Adversarial Simulator | | 0 |
| GOLD: Generalized Knowledge Distillation via Out-of-Distribution-Guided Language Data Generation | | 0 |
| HFedCKD: Toward Robust Heterogeneous Federated Learning via Data-free Knowledge Distillation and Two-way Contrast | | 0 |
| Knowledge Migration Framework for Smart Contract Vulnerability Detection | | 0 |
| CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation | | 0 |
| Completely Heterogeneous Federated Learning | | 0 |
| Conditional Generative Data-free Knowledge Distillation | | 0 |
| Dense Depth Distillation with Out-of-Distribution Simulated Images | | 0 |
| Data-Free Distillation of Language Model by Text-to-Text Transfer | | 0 |
| Data-Free Knowledge Distillation Using Adversarially Perturbed OpenGL Shader Images | | 0 |
| Data-Free Knowledge Distillation with Soft Targeted Transfer Set Synthesis | | 0 |
| Data-Free Knowledge Transfer: A Survey | | 0 |
| De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts | | 0 |
| DeGAN: Data-Enriching GAN for Retrieving Representative Samples from a Trained Classifier | | 0 |
| DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning | | 0 |
| Distilling Vision-Language Foundation Models: A Data-Free Approach via Prompt Diversification | | 0 |
| Distribution Shift Matters for Knowledge Distillation with Webly Collected Images | | 0 |
| Dual Discriminator Adversarial Distillation for Data-free Model Compression | | 0 |
| Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation | | 0 |
| Explicit and Implicit Knowledge Distillation via Unlabeled Data | | 0 |
| Feature-Rich Audio Model Inversion for Data-Free Knowledge Distillation Towards General Sound Classification | | 0 |
| FedD2S: Personalized Data-Free Federated Knowledge Distillation | | 0 |
| FedDTG: Federated Data-Free Knowledge Distillation via Three-Player Generative Adversarial Networks | | 0 |
| FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning | | 0 |
| Adversarial Curriculum Graph-Free Knowledge Distillation for Graph Neural Networks | | 0 |
| Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation | | 0 |
| Leveraging Foundation Models for Efficient Federated Learning in Resource-restricted Edge Networks | | 0 |
| Mind the Gap Between Synthetic and Real: Utilizing Transfer Learning to Probe the Boundaries of Stable Diffusion Generated Data | | 0 |
| Mitigating Cross-client GANs-based Attack in Federated Learning | | 0 |
Page 1 of 2

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | GOLD (T5-base) | Accuracy | 91.7 | | Unverified |
| 2 | ZeroGen (T5-base) | Accuracy | 88.5 | | Unverified |
| 3 | ProGen (T5-base) | Accuracy | 85.9 | | Unverified |
| 4 | Prompt2Model (T5-base) | Accuracy | 62.2 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | GOLD (T5-base) | Exact Match | 75.2 | | Unverified |
| 2 | Prompt2Model (T5-base) | Exact Match | 74.4 | | Unverified |
| 3 | ZeroGen (T5-base) | Exact Match | 69.4 | | Unverified |
| 4 | ProGen (T5-base) | Exact Match | 68.1 | | Unverified |