
Dataset Distillation

Dataset distillation is the task of synthesizing a small dataset such that models trained on it achieve high performance on the original large dataset. A dataset distillation algorithm takes as input a large real dataset to be distilled (the training set) and outputs a small synthetic distilled dataset, which is evaluated by training models on the distilled data and testing them on a separate real dataset (the validation/test set). A good distilled dataset is useful not only for understanding the original data, but also for applications such as continual learning, privacy-preserving learning, and neural architecture search.
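
To make the setup concrete, below is a minimal PyTorch sketch of this pipeline: a distillation loop (here using a simple gradient-matching objective, one representative family of methods) followed by the standard evaluation protocol of training a fresh model on the distilled set and testing it on real data. The model architecture, MNIST-like input shapes, helper names (make_model, distill, evaluate), and all hyperparameters are illustrative assumptions, not the method of any particular paper listed below.

```python
# Minimal sketch of the dataset distillation setup described above (PyTorch).
# Everything here is an illustrative assumption: a tiny MLP, 28x28 grayscale
# inputs, and a simple gradient-matching objective as a stand-in for a real
# distillation algorithm.
import torch
import torch.nn as nn
import torch.nn.functional as F


def make_model():
    # Small MLP for 28x28 grayscale images (e.g., MNIST); purely illustrative.
    return nn.Sequential(nn.Flatten(), nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))


def distill(real_loader, images_per_class=10, num_classes=10, steps=1000, device="cpu"):
    """Learn a small synthetic dataset whose training gradients mimic real data."""
    syn_x = torch.randn(num_classes * images_per_class, 1, 28, 28,
                        device=device, requires_grad=True)
    syn_y = torch.arange(num_classes, device=device).repeat_interleave(images_per_class)
    opt_syn = torch.optim.SGD([syn_x], lr=0.1)

    real_iter = iter(real_loader)
    for _ in range(steps):
        # Fresh randomly initialized network each step, so the synthetic data
        # does not overfit to a single set of weights.
        net = make_model().to(device)
        params = [p for p in net.parameters() if p.requires_grad]

        try:
            x_real, y_real = next(real_iter)
        except StopIteration:
            real_iter = iter(real_loader)
            x_real, y_real = next(real_iter)
        x_real, y_real = x_real.to(device), y_real.to(device)

        # Gradient of the loss on real data (treated as a constant target).
        g_real = torch.autograd.grad(F.cross_entropy(net(x_real), y_real), params)
        g_real = [g.detach() for g in g_real]

        # Gradient of the loss on synthetic data (differentiable w.r.t. syn_x).
        g_syn = torch.autograd.grad(F.cross_entropy(net(syn_x), syn_y),
                                    params, create_graph=True)

        # Match the two gradients and update the synthetic images.
        match_loss = sum(F.mse_loss(a, b) for a, b in zip(g_syn, g_real))
        opt_syn.zero_grad()
        match_loss.backward()
        opt_syn.step()

    return syn_x.detach(), syn_y


def evaluate(syn_x, syn_y, test_loader, epochs=50, device="cpu"):
    """Train a fresh model on the distilled set only, then test on real data."""
    net = make_model().to(device)
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(epochs):
        opt.zero_grad()
        F.cross_entropy(net(syn_x.to(device)), syn_y.to(device)).backward()
        opt.step()

    net.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in test_loader:
            x, y = x.to(device), y.to(device)
            correct += (net(x).argmax(dim=1) == y).sum().item()
            total += y.numel()
    return correct / total
```

The key property being illustrated is the evaluation protocol from the paragraph above: the model scored at the end never sees the original training set directly, only the small synthetic set produced by the distillation step.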

Papers

Showing 126–150 of 216 papers

On Implicit Bias in Overparameterized Bilevel Optimization
On Learning Representations for Tabular Data Distillation
On the Size and Approximation Error of Distilled Sets
OPTICAL: Leveraging Optimal Transport for Contribution Allocation in Dataset Distillation
PCPs: Patient Cardiac Prototypes
Permutation-Invariant and Orientation-Aware Dataset Distillation for 3D Point Clouds
Practical Dataset Distillation Based on Deep Support Vectors
Primitive3D: 3D Object Dataset Synthesis from Randomly Assembled Primitives
Progressive trajectory matching for medical dataset distillation
QuickDrop: Efficient Federated Unlearning by Integrated Dataset Distillation
Rethinking Backdoor Attacks on Dataset Distillation: A Kernel Method Perspective
Rethinking Data Distillation: Do Not Overlook Calibration
Robust Dataset Distillation by Matching Adversarial Trajectories
Robust Offline Reinforcement Learning for Non-Markovian Decision Processes
SelMatch: Effectively Scaling Up Dataset Distillation via Selection-Based Initialization and Partial Updates by Trajectory Matching
Slimmable Dataset Condensation
Task-Specific Generative Dataset Distillation with Difficulty-Guided Sampling
The Curse of Unrolling: Rate of Differentiating Through Optimization
The Evolution of Dataset Distillation: Toward Scalable and Generalizable Solutions
Towards Efficient Deep Hashing Retrieval: Condensing Your Data via Feature-Embedding Matching
Towards Stable and Storage-efficient Dataset Distillation: Matching Convexified Trajectory
Towards Universal Dataset Distillation via Task-Driven Diffusion
Trust-Aware Diversion for Data-Effective Distillation
UDD: Dataset Distillation via Mining Underutilized Regions
Understanding Dataset Distillation via Spectral Filtering

No leaderboard results yet.