| Title | Date | Tags |
| --- | --- | --- |
| Distilling Vision-Language Foundation Models: A Data-Free Approach via Prompt Diversification | Jul 21, 2024 | Data-free Knowledge Distillation, Image Generation |
| Mind the Gap Between Synthetic and Real: Utilizing Transfer Learning to Probe the Boundaries of Stable Diffusion Generated Data | May 6, 2024 | Data-free Knowledge Distillation, Knowledge Distillation |
| Mitigating Cross-client GANs-based Attack in Federated Learning | Jul 25, 2023 | Data-free Knowledge Distillation, Federated Learning |
| Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation | Sep 21, 2022 | Data-free Knowledge Distillation, Knowledge Distillation |
| CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation | Apr 30, 2025 | Data-free Knowledge Distillation, Knowledge Distillation |
| NIFF: Alleviating Forgetting in Generalized Few-Shot Object Detection via Neural Instance Feature Forging | Mar 9, 2023 | Data-free Knowledge Distillation, Few-Shot Object Detection |
| DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning | Sep 24, 2023 | Data-free Knowledge Distillation, Diversity |
| DeGAN: Data-Enriching GAN for Retrieving Representative Samples from a Trained Classifier | Dec 27, 2019 | Data-free Knowledge Distillation, Incremental Learning |
| De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts | Mar 28, 2024 | Causal Inference, Data-free Knowledge Distillation |
| Prompting to Distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt | May 16, 2022 | Data-free Knowledge Distillation, Knowledge Distillation |
| Data-Free Knowledge Transfer: A Survey | Dec 31, 2021 | Data-free Knowledge Distillation, Domain Adaptation |
| Data-Free Knowledge Distillation with Soft Targeted Transfer Set Synthesis | Apr 10, 2021 | Data-free Knowledge Distillation, Knowledge Distillation |
| Data-Free Knowledge Distillation Using Adversarially Perturbed OpenGL Shader Images | Oct 20, 2023 | Data Augmentation, Data-free Knowledge Distillation |
| Towards Lifelong Few-Shot Customization of Text-to-Image Diffusion | Nov 8, 2024 | Data-free Knowledge Distillation, Knowledge Distillation |
| Dual Discriminator Adversarial Distillation for Data-free Model Compression | Apr 12, 2021 | Data-free Knowledge Distillation, Knowledge Distillation |
| Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation | Nov 18, 2020 | Data-free Knowledge Distillation, Knowledge Distillation |
| Explicit and Implicit Knowledge Distillation via Unlabeled Data | Feb 17, 2023 | Data-free Knowledge Distillation, Knowledge Distillation |
| Feature-Rich Audio Model Inversion for Data-Free Knowledge Distillation Towards General Sound Classification | Mar 14, 2023 | Data-free Knowledge Distillation, Knowledge Distillation |
| FedD2S: Personalized Data-Free Federated Knowledge Distillation | Feb 16, 2024 | Data-free Knowledge Distillation, Fairness |
| FedDTG: Federated Data-Free Knowledge Distillation via Three-Player Generative Adversarial Networks | Jan 10, 2022 | Data-free Knowledge Distillation, Federated Learning |
| FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning | Apr 22, 2024 | Data-free Knowledge Distillation, Federated Learning |
| Adversarial Curriculum Graph-Free Knowledge Distillation for Graph Neural Networks | Apr 1, 2025 | Data-free Knowledge Distillation, Knowledge Distillation |
| Generative Adversarial Simulator | Nov 23, 2020 | Data-free Knowledge Distillation, Knowledge Distillation |
| GOLD: Generalized Knowledge Distillation via Out-of-Distribution-Guided Language Data Generation | Mar 28, 2024 | Data-free Knowledge Distillation, Knowledge Distillation |
| Sampling to Distill: Knowledge Transfer from Open-World Data | Jul 31, 2023 | Data-free Knowledge Distillation, Knowledge Distillation |
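The common loop underlying the papers listed above is: synthesize inputs without touching the teacher's training data, query the frozen teacher on them, and fit the student to the teacher's outputs. A minimal sketch of that loop, using plain Gaussian noise as the stand-in "generator" and a hypothetical linear teacher (real methods replace both with a trained GAN or prompt-driven generator and a deep network):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical frozen "teacher": a fixed linear map whose training
# data we never see (the data-free setting).
W_teacher = rng.normal(size=(4, 3))

def teacher(x):
    return x @ W_teacher

# Student starts from scratch and only ever sees synthetic inputs.
W_student = np.zeros((4, 3))
lr = 0.1
for step in range(500):
    x = rng.normal(size=(32, 4))     # synthesized inputs (noise "generator")
    t = teacher(x)                   # teacher soft targets
    s = x @ W_student                # student predictions
    grad = x.T @ (s - t) / len(x)    # gradient of the MSE distillation loss
    W_student -= lr * grad

gap = float(np.abs(W_student - W_teacher).max())
print(gap)  # near zero: the student recovers the teacher with no real data
```

Note that the distillation loss vanishes exactly when the student matches the teacher, regardless of which synthetic batch was drawn; the papers above differ mainly in how the synthetic inputs are made informative (adversarial generators, prompts, diffusion models, open-world samples).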