| Title | Date | Tasks | Status | | |
| --- | --- | --- | --- | --- | --- |
| Distilling Vision-Language Foundation Models: A Data-Free Approach via Prompt Diversification | Jul 21, 2024 | Data-free Knowledge Distillation, Image Generation | Unverified | 0 | 0 |
| Mind the Gap Between Synthetic and Real: Utilizing Transfer Learning to Probe the Boundaries of Stable Diffusion Generated Data | May 6, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified | 0 | 0 |
| Mitigating Cross-client GANs-based Attack in Federated Learning | Jul 25, 2023 | Data-free Knowledge Distillation, Federated Learning | Unverified | 0 | 0 |
| Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation | Sep 21, 2022 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified | 0 | 0 |
| CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation | Apr 30, 2025 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified | 0 | 0 |
| NIFF: Alleviating Forgetting in Generalized Few-Shot Object Detection via Neural Instance Feature Forging | Mar 9, 2023 | Data-free Knowledge Distillation, Few-Shot Object Detection | Unverified | 0 | 0 |
| DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning | Sep 24, 2023 | Data-free Knowledge Distillation, Diversity | Unverified | 0 | 0 |
| DeGAN: Data-Enriching GAN for Retrieving Representative Samples from a Trained Classifier | Dec 27, 2019 | Data-free Knowledge Distillation, Incremental Learning | Unverified | 0 | 0 |
| De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts | Mar 28, 2024 | Causal Inference, Data-free Knowledge Distillation | Unverified | 0 | 0 |
| Prompting to Distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt | May 16, 2022 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified | 0 | 0 |