Awesome-Dataset-Distillation

Comprehensive Overview of Dataset Distillation Methods and Recent Developments

Product Description

This project offers a detailed overview of dataset distillation, the task of condensing large datasets into much smaller synthetic sets that preserve most of the original training signal. It highlights the value of distilled datasets in areas such as continual learning, privacy preservation, and neural architecture search. Covering the field since its introduction in 2018, the collection tracks major research efforts, core methods such as gradient matching and distribution matching, and practical use cases. Maintained by Guang Li, Bo Zhao, and Tongzhou Wang, it serves as an essential resource for anyone interested in dataset compression and its applications.
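To give a feel for the gradient-matching family of methods mentioned above, here is a minimal toy sketch (not taken from any paper in the collection; the linear-regression setup, sizes, and learning rate are all illustrative assumptions): a small synthetic set is optimized so that the model gradient it induces mimics the gradient induced by the full real dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "real" dataset: 100 points from a noisy linear model
X_real = rng.normal(size=(100, 5))
w_true = rng.normal(size=5)
y_real = X_real @ w_true + 0.1 * rng.normal(size=100)

# Learnable synthetic set: only 10 points
X_syn = rng.normal(size=(10, 5))
y_syn = rng.normal(size=10)

def grad_mse(X, y, w):
    """Gradient of the loss 0.5/n * ||Xw - y||^2 w.r.t. model weights w."""
    return X.T @ (X @ w - y) / len(y)

# Fixed probe weights to measure how well gradients match, before vs. after
probe_ws = [rng.normal(size=5) for _ in range(20)]

def avg_match_loss(X, y):
    return float(np.mean([
        0.5 * np.sum((grad_mse(X, y, w) - grad_mse(X_real, y_real, w)) ** 2)
        for w in probe_ws
    ]))

loss_before = avg_match_loss(X_syn, y_syn)

lr = 0.02
for step in range(1000):
    # Sample a fresh random model, since matching is averaged over weights
    w = rng.normal(size=5)
    g_real = grad_mse(X_real, y_real, w)
    n = len(y_syn)
    r = X_syn @ w - y_syn                 # residuals on the synthetic set
    diff = X_syn.T @ r / n - g_real       # gradient mismatch at these weights
    # Analytic gradients of 0.5*||diff||^2 w.r.t. the synthetic data itself
    gX = (np.outer(r, diff) + np.outer(X_syn @ diff, w)) / n
    gy = -(X_syn @ diff) / n
    X_syn -= lr * gX                      # update synthetic inputs
    y_syn -= lr * gy                      # update synthetic targets

loss_after = avg_match_loss(X_syn, y_syn)
print(loss_before, loss_after)  # matching loss drops after distillation
```

In full-scale methods the linear model is replaced by a neural network and the gradients come from autodiff, but the loop has the same shape: sample model weights, compare real and synthetic gradients, and descend on the synthetic data.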
Project Details