SOTAVerified

Dataset Distillation

Dataset distillation is the task of synthesizing a small dataset such that models trained on it achieve high performance on the original large dataset. A dataset distillation algorithm takes as input a large real dataset to be distilled (training set) and outputs a small synthetic distilled dataset, which is evaluated by training models on the distilled dataset and testing them on a separate real dataset (validation/test set). A good small distilled dataset is not only useful for understanding the original data, but also has various applications (e.g., continual learning, privacy-preserving training, and neural architecture search).
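The evaluation protocol above can be sketched in a few lines. This is a minimal NumPy illustration, not any specific method from the papers below: the "distillation" step here is a trivial baseline that keeps one synthetic example per class (the class mean), whereas real methods optimize the synthetic examples; the "model" trained on the distilled set is a nearest-centroid classifier.

```python
import numpy as np

def distill_by_class_means(X_train, y_train):
    """Toy baseline 'distillation': one synthetic example per class
    (the class mean). Real methods optimize these examples instead."""
    classes = np.unique(y_train)
    X_syn = np.stack([X_train[y_train == c].mean(axis=0) for c in classes])
    return X_syn, classes

def evaluate(X_syn, y_syn, X_test, y_test):
    """Evaluation protocol: train a model on the distilled set ONLY,
    then measure its accuracy on the held-out real test set.
    The model here is a nearest-centroid classifier for simplicity."""
    # squared distance from each test point to each synthetic example
    d = ((X_test[:, None, :] - X_syn[None, :, :]) ** 2).sum(axis=-1)
    preds = y_syn[d.argmin(axis=1)]
    return (preds == y_test).mean()

# toy data: two well-separated Gaussian classes, 200 real examples total
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.5, size=(100, 2)),
               rng.normal(3.0, 0.5, size=(100, 2))])
y = np.array([0] * 100 + [1] * 100)

# 200 real examples are distilled down to 2 synthetic ones
X_syn, y_syn = distill_by_class_means(X, y)
acc = evaluate(X_syn, y_syn, X, y)
print(len(X_syn), float(acc))
```

On this separable toy problem the two class means are enough to classify nearly every point, which is the point of the exercise: performance is attributed to the distilled set, since the downstream model never sees the full training data.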

Papers

Showing 51–75 of 216 papers

Title | Status | Hype
A Label is Worth a Thousand Images in Dataset Distillation | Code | 1
Unlocking the Potential of Federated Learning: The Symphony of Dataset Distillation via Deep Generative Latents | Code | 1
Flowing Datasets with Wasserstein over Wasserstein Gradient Flows | Code | 1
Are Large-scale Soft Labels Necessary for Large-scale Dataset Distillation? | Code | 1
Soft-Label Dataset Distillation and Text Dataset Distillation | Code | 1
Dataset Distillation via Factorization | Code | 1
D^4M: Dataset Distillation via Disentangled Diffusion Model | Code | 1
A Large-Scale Study on Video Action Dataset Condensation | Code | 1
Dataset Distillation via Vision-Language Category Prototype | Code | 1
Dataset Distillation with Convexified Implicit Gradients | Code | 1
DREAM+: Efficient Dataset Distillation by Bidirectional Representative Matching | Code | 1
Self-Supervised Dataset Distillation for Transfer Learning | Code | 1
DataDAM: Efficient Dataset Distillation with Attention Matching | Code | 1
DiM: Distilling Dataset into Generative Model | Code | 1
Dataset Factorization for Condensation | Code | 1
Embarrassingly Simple Dataset Distillation | Code | 1
Dark Distillation: Backdooring Distilled Datasets without Accessing Raw Data | - | 0
Dataset Distillation via the Wasserstein Metric | - | 0
Curriculum Dataset Distillation | - | 0
Dataset Distillation Using Parameter Pruning | - | 0
Distribution-aware Dataset Distillation for Efficient Image Restoration | - | 0
Dataset Distillation Meets Provable Subset Selection | - | 0
Distilling Long-tailed Datasets | - | 0
Diversity-Driven Generative Dataset Distillation Based on Diffusion Model with Self-Adaptive Memory | - | 0
Dataset Distillation in Medical Imaging: A Feasibility Study | - | 0
Page 3 of 9

No leaderboard results yet.