| Title | Date | Tags | Code | Stars |
| --- | --- | --- | --- | --- |
| FedD2S: Personalized Data-Free Federated Knowledge Distillation | Feb 16, 2024 | Data-free Knowledge Distillation, Fairness | Unverified | 0 |
| Data-Free Distillation of Language Model by Text-to-Text Transfer | Nov 3, 2023 | Data-free Knowledge Distillation, Diversity | Unverified | 0 |
| Cross-feature Contrastive Loss for Decentralized Deep Learning on Heterogeneous Data | Oct 24, 2023 | Data-free Knowledge Distillation, Knowledge Distillation | Code Available | 0 |
| Data-Free Knowledge Distillation Using Adversarially Perturbed OpenGL Shader Images | Oct 20, 2023 | Data Augmentation, Data-free Knowledge Distillation | Unverified | 0 |
| DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning | Sep 24, 2023 | Data-free Knowledge Distillation, Diversity | Unverified | 0 |
| DAD++: Improved Data-free Test Time Adversarial Defense | Sep 10, 2023 | Adversarial Defense, Adversarial Robustness | Code Available | 0 |
| Sampling to Distill: Knowledge Transfer from Open-World Data | Jul 31, 2023 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified | 0 |
| Mitigating Cross-client GANs-based Attack in Federated Learning | Jul 25, 2023 | Data-free Knowledge Distillation, Federated Learning | Unverified | 0 |
| Distribution Shift Matters for Knowledge Distillation with Webly Collected Images | Jul 21, 2023 | Contrastive Learning, Data-free Knowledge Distillation | Unverified | 0 |
| Customizing Synthetic Data for Data-Free Student Learning | Jul 10, 2023 | Data-free Knowledge Distillation, Knowledge Distillation | Code Available | 0 |