| Title | Date | Topics | Code | Count |
| --- | --- | --- | --- | --- |
| Mosaic: Data-Free Knowledge Distillation via Mixture-of-Experts for Heterogeneous Distributed Environments | May 26, 2025 | Data-free Knowledge Distillation, Federated Learning | Code Available | 0 |
| CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation | Apr 30, 2025 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified | 0 |
| Adversarial Curriculum Graph-Free Knowledge Distillation for Graph Neural Networks | Apr 1, 2025 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified | 0 |
| HFedCKD: Toward Robust Heterogeneous Federated Learning via Data-free Knowledge Distillation and Two-way Contrast | Mar 9, 2025 | Data-free Knowledge Distillation, Federated Learning | Unverified | 0 |
| Does Training with Synthetic Data Truly Protect Privacy? | Feb 18, 2025 | Data-free Knowledge Distillation, Dataset Distillation | Code Available | 0 |
| Hybrid Data-Free Knowledge Distillation | Dec 18, 2024 | Data-free Knowledge Distillation, Generative Adversarial Network | Code Available | 0 |
| Relation-Guided Adversarial Learning for Data-free Knowledge Transfer | Dec 16, 2024 | Data-free Knowledge Distillation, Data Free Quantization | Code Available | 1 |
| Knowledge Migration Framework for Smart Contract Vulnerability Detection | Dec 15, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified | 0 |
| One-shot Federated Learning via Synthetic Distiller-Distillate Communication | Dec 6, 2024 | Data-free Knowledge Distillation, Federated Learning | Code Available | 1 |
| Large-Scale Data-Free Knowledge Distillation for ImageNet via Multi-Resolution Data Generation | Nov 26, 2024 | Data-free Knowledge Distillation, Diversity | Code Available | 0 |
| Towards Lifelong Few-Shot Customization of Text-to-Image Diffusion | Nov 8, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified | 0 |
| Towards Effective Data-Free Knowledge Distillation via Diverse Diffusion Augmentation | Oct 23, 2024 | Data-free Knowledge Distillation, Diversity | Code Available | 0 |
| Leveraging Foundation Models for Efficient Federated Learning in Resource-restricted Edge Networks | Sep 14, 2024 | Data-free Knowledge Distillation, Federated Learning | Unverified | 0 |
| EchoDFKD: Data-Free Knowledge Distillation for Cardiac Ultrasound Segmentation using Synthetic Data | Sep 11, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | Code Available | 1 |
| DKDM: Data-Free Knowledge Distillation for Diffusion Models with Any Architecture | Sep 5, 2024 | Data-free Knowledge Distillation, Denoising | Code Available | 1 |
| Diffusion-Driven Data Replay: A Novel Approach to Combat Forgetting in Federated Class Continual Learning | Sep 2, 2024 | Continual Learning, Contrastive Learning | Code Available | 1 |
| Distilling Vision-Language Foundation Models: A Data-Free Approach via Prompt Diversification | Jul 21, 2024 | Data-free Knowledge Distillation, Image Generation | Unverified | 0 |
| Small Scale Data-Free Knowledge Distillation | Jun 12, 2024 | Data-free Knowledge Distillation, Generative Adversarial Network | Code Available | 1 |
| Mind the Gap Between Synthetic and Real: Utilizing Transfer Learning to Probe the Boundaries of Stable Diffusion Generated Data | May 6, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified | 0 |
| FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning | Apr 22, 2024 | Data-free Knowledge Distillation, Federated Learning | Unverified | 0 |
| Data-free Knowledge Distillation for Fine-grained Visual Categorization | Apr 18, 2024 | Data-free Knowledge Distillation, Fine-Grained Visual Categorization | Code Available | 0 |
| GOLD: Generalized Knowledge Distillation via Out-of-Distribution-Guided Language Data Generation | Mar 28, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified | 0 |
| De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts | Mar 28, 2024 | Causal Inference, Data-free Knowledge Distillation | Unverified | 0 |
| AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation | Mar 11, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | Code Available | 0 |
| Teacher as a Lenient Expert: Teacher-Agnostic Data-Free Knowledge Distillation | Feb 18, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | Code Available | 0 |