SOTAVerified

Dataset Distillation

Dataset distillation is the task of synthesizing a small dataset such that models trained on it achieve high performance when evaluated on the original large dataset. A dataset distillation algorithm takes a large real dataset (the training set) as input and outputs a small synthetic distilled dataset; the distilled dataset is evaluated by training models on it and measuring their performance on a separate real dataset (the validation/test set). Beyond aiding dataset understanding, a good distilled dataset has many applications, e.g., continual learning, privacy-preserving learning, and neural architecture search.
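To make this distill-then-evaluate protocol concrete, below is a minimal, self-contained PyTorch sketch on toy data. It is illustrative only and is not the method of any paper listed here: it distills Gaussian-blob classification data into a few learnable synthetic points per class by matching class-conditional mean embeddings under freshly sampled random networks (a simplified distribution-matching objective), then evaluates by training a fresh classifier on the distilled points and testing on held-out real data. All names and hyperparameters (`x_syn`, `SYN_PER_CLASS`, step counts, etc.) are assumptions chosen for the example.

```python
# Toy dataset-distillation sketch (illustrative; not any listed paper's method).
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
NUM_CLASSES, DIM, N_PER_CLASS, SYN_PER_CLASS = 3, 20, 500, 5

# --- Toy "large real dataset": one Gaussian blob per class ---
means = torch.randn(NUM_CLASSES, DIM) * 3

def sample(n_per_class):
    x = torch.cat([means[c] + torch.randn(n_per_class, DIM)
                   for c in range(NUM_CLASSES)])
    y = torch.arange(NUM_CLASSES).repeat_interleave(n_per_class)
    return x, y

x_train, y_train = sample(N_PER_CLASS)   # set to be distilled
x_test, y_test = sample(N_PER_CLASS)     # held-out real data for evaluation

# --- Learnable synthetic dataset (the distilled data) ---
x_syn = torch.randn(NUM_CLASSES * SYN_PER_CLASS, DIM, requires_grad=True)
y_syn = torch.arange(NUM_CLASSES).repeat_interleave(SYN_PER_CLASS)
opt = torch.optim.Adam([x_syn], lr=0.1)

# --- Distillation: match class-wise mean embeddings under random nets ---
for step in range(300):
    # Fresh random embedding each step, so the match is not net-specific.
    embed = nn.Sequential(nn.Linear(DIM, 64), nn.ReLU(), nn.Linear(64, 32))
    loss = 0.0
    for c in range(NUM_CLASSES):
        real_mu = embed(x_train[y_train == c]).mean(0)
        syn_mu = embed(x_syn[y_syn == c]).mean(0)
        loss = loss + F.mse_loss(syn_mu, real_mu)
    opt.zero_grad()
    loss.backward()
    opt.step()

# --- Evaluation: train a fresh model on the distilled set, test on real data ---
clf = nn.Linear(DIM, NUM_CLASSES)
clf_opt = torch.optim.Adam(clf.parameters(), lr=0.05)
for _ in range(200):
    ce = F.cross_entropy(clf(x_syn.detach()), y_syn)
    clf_opt.zero_grad()
    ce.backward()
    clf_opt.step()

acc = (clf(x_test).argmax(1) == y_test).float().mean().item()
print(f"test accuracy from {NUM_CLASSES * SYN_PER_CLASS} distilled points: {acc:.3f}")
```

Published methods replace this toy objective with, e.g., gradient matching, training-trajectory matching, or diffusion-based generation (several such approaches appear in the paper list below), and operate on image datasets at far larger scale.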

Papers

Showing 1–50 of 216 papers

| Title | Status | Hype |
| --- | --- | --- |
| Dataset Distillation with Neural Characteristic Function: A Minmax Perspective | Code | 3 |
| DD-Ranking: Rethinking the Evaluation of Dataset Distillation | Code | 2 |
| FedCache 2.0: Federated Edge Learning with Knowledge Caching and Dataset Distillation | Code | 2 |
| Self-supervised Dataset Distillation: A Good Compression Is All You Need | Code | 2 |
| Dataset Quantization | Code | 2 |
| Dataset Distillation by Matching Training Trajectories | Code | 2 |
| FADRM: Fast and Accurate Data Residual Matching for Dataset Distillation | Code | 1 |
| Dataset Distillation via Vision-Language Category Prototype | Code | 1 |
| CaO_2: Rectifying Inconsistencies in Diffusion-Based Dataset Distillation | Code | 1 |
| Flowing Datasets with Wasserstein over Wasserstein Gradient Flows | Code | 1 |
| OD3: Optimization-free Dataset Distillation for Object Detection | Code | 1 |
| Taming Diffusion for Dataset Distillation with High Representativeness | Code | 1 |
| Distilling Dataset into Neural Field | Code | 1 |
| Dataset Distillation via Committee Voting | Code | 1 |
| A Large-Scale Study on Video Action Dataset Condensation | Code | 1 |
| DELT: A Simple Diversity-driven EarlyLate Training for Dataset Distillation | Code | 1 |
| Emphasizing Discriminative Features for Dataset Distillation in Complex Scenarios | Code | 1 |
| Are Large-scale Soft Labels Necessary for Large-scale Dataset Distillation? | Code | 1 |
| Generative Dataset Distillation Based on Diffusion Model | Code | 1 |
| Prioritize Alignment in Dataset Distillation | Code | 1 |
| D^4M: Dataset Distillation via Disentangled Diffusion Model | Code | 1 |
| Dataset Quantization with Active Learning based Adaptive Sampling | Code | 1 |
| A Label is Worth a Thousand Images in Dataset Distillation | Code | 1 |
| What is Dataset Distillation Learning? | Code | 1 |
| Low-Rank Similarity Mining for Multimodal Dataset Distillation | Code | 1 |
| Efficiency for Free: Ideal Data Are Transportable Representations | Code | 1 |
| GIFT: Unlocking Full Potential of Labels in Distilled Dataset at Near-zero Cost | Code | 1 |
| Exploiting Inter-sample and Inter-feature Relations in Dataset Distillation | Code | 1 |
| DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation | Code | 1 |
| Distilling Datasets Into Less Than One Image | Code | 1 |
| Improve Cross-Architecture Generalization on Dataset Distillation | Code | 1 |
| Group Distributionally Robust Dataset Distillation with Risk Minimization | Code | 1 |
| D^4: Dataset Distillation via Disentangled Diffusion Model | Code | 1 |
| On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm | Code | 1 |
| Unlocking the Potential of Federated Learning: The Symphony of Dataset Distillation via Deep Generative Latents | Code | 1 |
| Dancing with Still Images: Video Distillation via Static-Dynamic Disentanglement | Code | 1 |
| Dataset Distillation via Curriculum Data Synthesis in Large Data Era | Code | 1 |
| Efficient Dataset Distillation via Minimax Diffusion | Code | 1 |
| Frequency Domain-based Dataset Distillation | Code | 1 |
| Embarrassingly Simple Dataset Distillation | Code | 1 |
| Label Poisoning is All You Need | Code | 1 |
| DREAM+: Efficient Dataset Distillation by Bidirectional Representative Matching | Code | 1 |
| Does Graph Distillation See Like Vision Dataset Counterpart? | Code | 1 |
| Self-Supervised Dataset Distillation for Transfer Learning | Code | 1 |
| Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching | Code | 1 |
| Can pre-trained models assist in dataset distillation? | Code | 1 |
| DataDAM: Efficient Dataset Distillation with Attention Matching | Code | 1 |
| Vision-Language Dataset Distillation | Code | 1 |
| Towards Trustworthy Dataset Distillation | Code | 1 |
| Squeeze, Recover and Relabel: Dataset Condensation at ImageNet Scale From A New Perspective | Code | 1 |

Leaderboard

No leaderboard results yet.