| Title | Date | Tags | Code |
| --- | --- | --- | --- |
| Towards Effective Data-Free Knowledge Distillation via Diverse Diffusion Augmentation | Oct 23, 2024 | Data-free Knowledge Distillation, Diversity | Code Available |
| Leveraging Foundation Models for Efficient Federated Learning in Resource-restricted Edge Networks | Sep 14, 2024 | Data-free Knowledge Distillation, Federated Learning | — |
| Distilling Vision-Language Foundation Models: A Data-Free Approach via Prompt Diversification | Jul 21, 2024 | Data-free Knowledge Distillation, Image Generation | — |
| Mind the Gap Between Synthetic and Real: Utilizing Transfer Learning to Probe the Boundaries of Stable Diffusion Generated Data | May 6, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | — |
| FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning | Apr 22, 2024 | Data-free Knowledge Distillation, Federated Learning | — |
| Data-free Knowledge Distillation for Fine-grained Visual Categorization | Apr 18, 2024 | Data-free Knowledge Distillation, Fine-Grained Visual Categorization | Code Available |
| De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts | Mar 28, 2024 | Causal Inference, Data-free Knowledge Distillation | — |
| GOLD: Generalized Knowledge Distillation via Out-of-Distribution-Guided Language Data Generation | Mar 28, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | — |
| AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation | Mar 11, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | Code Available |
| Teacher as a Lenient Expert: Teacher-Agnostic Data-Free Knowledge Distillation | Feb 18, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | Code Available |