| Paper | Date | Tasks | Code | Implementations |
| --- | --- | --- | --- | --- |
| Prompt2Model: Generating Deployable Models from Natural Language Instructions | Aug 23, 2023 | Data-free Knowledge Distillation, Dataset Generation | Available | 4 |
| Data-Free Knowledge Distillation for Deep Neural Networks | Oct 19, 2017 | Data-free Knowledge Distillation, Knowledge Distillation | Available | 2 |
| Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay | Jan 9, 2022 | Data-free Knowledge Distillation, Image Classification | Available | 1 |
| Training Generative Adversarial Networks in One Stage | Feb 28, 2021 | Data-free Knowledge Distillation, Image Generation | Available | 1 |
| Up to 100× Faster Data-free Knowledge Distillation | Dec 12, 2021 | Data-free Knowledge Distillation, Knowledge Distillation | Available | 1 |
| Contrastive Model Inversion for Data-Free Knowledge Distillation | May 18, 2021 | Contrastive Learning, Data-free Knowledge Distillation | Available | 1 |
| One-shot Federated Learning via Synthetic Distiller-Distillate Communication | Dec 6, 2024 | Data-free Knowledge Distillation, Federated Learning | Available | 1 |
| Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning | Mar 17, 2022 | Data-free Knowledge Distillation, Federated Learning | Available | 1 |
| Relation-Guided Adversarial Learning for Data-free Knowledge Transfer | Dec 16, 2024 | Data-free Knowledge Distillation, Data Free Quantization | Available | 1 |
| Small Scale Data-Free Knowledge Distillation | Jun 12, 2024 | Data-free Knowledge Distillation, Generative Adversarial Network | Available | 1 |
| ZeroGen: Efficient Zero-shot Learning via Dataset Generation | Feb 16, 2022 | Data-free Knowledge Distillation, Dataset Generation | Available | 1 |
| Diffusion-Driven Data Replay: A Novel Approach to Combat Forgetting in Federated Class Continual Learning | Sep 2, 2024 | Continual Learning, Contrastive Learning | Available | 1 |
| Data-Free Knowledge Distillation for Heterogeneous Federated Learning | May 20, 2021 | Data-free Knowledge Distillation, Federated Learning | Available | 1 |
| MAZE: Data-Free Model Stealing Attack Using Zeroth-Order Gradient Estimation | May 6, 2020 | Data-free Knowledge Distillation, Knowledge Distillation | Available | 1 |
| ProGen: Progressive Zero-shot Dataset Generation via In-context Feedback | Oct 22, 2022 | Data-free Knowledge Distillation, Dataset Generation | Available | 1 |
| Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation? | May 22, 2023 | Data-free Knowledge Distillation, Few-Shot Learning | Available | 1 |
| DKDM: Data-Free Knowledge Distillation for Diffusion Models with Any Architecture | Sep 5, 2024 | Data-free Knowledge Distillation, Denoising | Available | 1 |
| Data-Free Knowledge Distillation via Feature Exchange and Activation Region Constraint | Jan 1, 2023 | Data Augmentation, Data-free Knowledge Distillation | Available | 1 |
| EchoDFKD: Data-Free Knowledge Distillation for Cardiac Ultrasound Segmentation using Synthetic Data | Sep 11, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | Available | 1 |
| NAYER: Noisy Layer Data Generation for Efficient and Effective Data-free Knowledge Distillation | Sep 30, 2023 | Data-free Knowledge Distillation, Knowledge Distillation | Available | 1 |
| Revisiting Data-Free Knowledge Distillation with Poisoned Teachers | Jun 4, 2023 | Backdoor Defense for Data-Free Distillation with Poisoned Teachers, Data-free Knowledge Distillation | Available | 1 |
| Conditional Generative Data-free Knowledge Distillation | Dec 31, 2021 | Conditional Image Generation, Data-free Knowledge Distillation | Unverified | 0 |
| DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning | Sep 24, 2023 | Data-free Knowledge Distillation, Diversity | Unverified | 0 |
| Data-Free Distillation of Language Model by Text-to-Text Transfer | Nov 3, 2023 | Data-free Knowledge Distillation, Diversity | Unverified | 0 |
| CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation | Apr 30, 2025 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified | 0 |