Group Distributionally Robust Dataset Distillation with Risk Minimization

7 February 2024
Saeed Vahidian
Mingyu Wang
Jianyang Gu
Vyacheslav Kungurtsev
Wei Jiang
Yiran Chen
Communities: OOD, DD
ArXiv · PDF · HTML
Abstract

Dataset distillation (DD) has emerged as a widely adopted technique for crafting a synthetic dataset that captures the essential information of a training dataset, facilitating the training of accurate neural models. Its applications span various domains, including transfer learning, federated learning, and neural architecture search. The most popular methods for constructing the synthetic data rely on matching the convergence behavior of a model trained on the synthetic dataset to that of a model trained on the original training dataset. However, using the empirical loss as the matching criterion should be regarded as auxiliary, in the same sense that the training set is only an approximate substitute for the population distribution, which is the data of actual interest. Despite DD's popularity, its relationship to generalization remains unexplored, particularly across uncommon subgroups: how can we ensure that a model trained on the synthetic dataset performs well when faced with samples from regions of low population density? In this setting, the representativeness and coverage of the synthetic dataset become more salient than its guaranteed training error at inference. Drawing inspiration from distributionally robust optimization, we introduce an algorithm that combines clustering with the minimization of a risk measure on the loss to conduct DD. We provide a theoretical rationale for our approach and demonstrate its effective generalization and robustness across subgroups through numerical experiments. The source code is available at this https URL.
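
To make the abstract's recipe concrete — cluster the data into subgroups, then distill by minimizing a risk measure over subgroup losses rather than the average loss — here is a minimal Python sketch. It assumes k-means for the clustering step, conditional value-at-risk (CVaR) as the risk measure, and a single differentiable inner gradient step as a stand-in for training on the synthetic data; all names, shapes, and hyperparameters are illustrative and not taken from the paper.

    # Illustrative sketch only: a bilevel dataset-distillation loop that swaps
    # the usual average real-data loss for a risk measure over cluster losses.
    # k-means subgroups, CVaR, the one-step inner update, and all sizes/names
    # are assumptions for exposition, not the paper's implementation.
    import torch
    import torch.nn.functional as F
    from sklearn.cluster import KMeans

    torch.manual_seed(0)
    real_x = torch.randn(512, 16)                    # stand-in real inputs
    real_y = torch.randint(0, 4, (512,))             # stand-in labels, 4 classes

    syn_x = torch.randn(32, 16, requires_grad=True)  # learnable synthetic inputs
    syn_y = torch.arange(4).repeat(8)                # fixed, balanced synthetic labels

    # Step 1: cluster the real data to define subgroups (assumption: k-means).
    k = 8
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(real_x.numpy())
    groups = [torch.from_numpy((labels == g).nonzero()[0]) for g in range(k)]

    def cvar(losses: torch.Tensor, alpha: float = 0.25) -> torch.Tensor:
        # Conditional value-at-risk: mean of the worst alpha-fraction of losses.
        m = max(1, int(alpha * losses.numel()))
        return torch.topk(losses, m).values.mean()

    opt = torch.optim.Adam([syn_x], lr=1e-2)
    for step in range(200):
        # Inner step: one differentiable gradient update of a fresh linear model
        # trained on the synthetic data (a stand-in for full inner training).
        inner = torch.nn.Linear(16, 4)
        inner_loss = F.cross_entropy(inner(syn_x), syn_y)
        gw, gb = torch.autograd.grad(inner_loss, [inner.weight, inner.bias],
                                     create_graph=True)
        w, b = inner.weight - 0.1 * gw, inner.bias - 0.1 * gb

        # Outer step: evaluate the updated model per subgroup on real data,
        # then minimize the risk measure over subgroup losses w.r.t. syn_x.
        group_losses = torch.stack([
            F.cross_entropy(real_x[idx] @ w.t() + b, real_y[idx]) for idx in groups
        ])
        robust_loss = cvar(group_losses)
        opt.zero_grad()
        robust_loss.backward()
        opt.step()

    print(group_losses.detach())  # per-cluster losses after distillation

The design point the sketch isolates is the outer objective: replacing the mean of the per-cluster losses with CVaR pushes gradient signal toward the worst-off subgroups, which is what gives the distilled set coverage of low-density regions rather than only average-case fidelity.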

@article{vahidian2025_2402.04676,
  title={Group Distributionally Robust Dataset Distillation with Risk Minimization},
  author={Saeed Vahidian and Mingyu Wang and Jianyang Gu and Vyacheslav Kungurtsev and Wei Jiang and Yiran Chen},
  journal={arXiv preprint arXiv:2402.04676},
  year={2025}
}