Enhancing Dataset Distillation via Non-Critical Region Refinement

24 March 2025
Minh-Tuan Tran
Trung Le
Xuan-May Le
Thanh-Toan Do
Dinh Q. Phung
Abstract

Dataset distillation has become a popular method for compressing large datasets into smaller, more efficient representations while preserving critical information for model training. Data features are broadly categorized into two types: instance-specific features, which capture unique, fine-grained details of individual examples, and class-general features, which represent shared, broad patterns across a class. However, previous approaches often struggle to balance these two types: some focus solely on class-general patterns and neglect finer instance details, while others prioritize instance-specific features and overlook the shared characteristics essential for class-level understanding. In this paper, we introduce Non-Critical Region Refinement Dataset Distillation (NRR-DD), a method that preserves instance-specific details and fine-grained regions in synthetic data while enriching non-critical regions with class-general information. This approach enables models to leverage all pixel information, capturing both feature types and enhancing overall performance. Additionally, we present Distance-Based Representative (DBR) knowledge transfer, which eliminates the need for soft labels in training by relying on the distance between synthetic data predictions and one-hot encoded labels. Experimental results show that NRR-DD achieves state-of-the-art performance on both small- and large-scale datasets. Furthermore, by storing only two distances per instance, our method delivers comparable results across various settings. The code is available at this https URL.
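The abstract describes two components that a short sketch can make concrete: region refinement, where pixels in critical regions keep their instance-specific values while non-critical regions are enriched with class-general information, and DBR knowledge transfer, which replaces stored soft labels with two distances per instance. The PyTorch sketch below is a minimal illustration only: the masking criterion, the blending rule, and the exact definition of the two stored distances are assumptions (the abstract does not specify them), and all names (refine_non_critical_regions, critical_mask, class_prototype, dbr_distances) are hypothetical, not the authors' code.

import torch
import torch.nn.functional as F

def refine_non_critical_regions(synthetic_img, critical_mask, class_prototype, alpha=0.5):
    # Keep instance-specific pixels where the mask marks a critical region;
    # blend class-general information into the remaining (non-critical) pixels.
    # synthetic_img:   (C, H, W) distilled image
    # critical_mask:   (1, H, W) in [0, 1]; 1 = critical (instance-specific) region
    # class_prototype: (C, H, W) class-general pattern, e.g. a per-class mean image
    # alpha and the linear blend are illustrative choices, not the paper's rule.
    non_critical = 1.0 - critical_mask
    blended = alpha * synthetic_img + (1.0 - alpha) * class_prototype
    return critical_mask * synthetic_img + non_critical * blended

def dbr_distances(model, synthetic_img, label, num_classes):
    # DBR bookkeeping (illustrative): store two scalars per instance instead of a
    # full soft-label vector. Which two distances the paper stores is not stated
    # in the abstract; here we assume the prediction's distance to the one-hot
    # target and the residual probability mass spread over the other classes.
    with torch.no_grad():
        probs = F.softmax(model(synthetic_img.unsqueeze(0)), dim=1).squeeze(0)
    one_hot = F.one_hot(torch.tensor(label), num_classes).float()
    d_target = torch.norm(probs - one_hot)        # distance to the true class
    d_rest = torch.norm(probs * (1.0 - one_hot))  # mass off the target class
    return d_target.item(), d_rest.item()

# Toy usage with random tensors:
#   img = torch.rand(3, 32, 32)
#   mask = (torch.rand(1, 32, 32) > 0.5).float()
#   proto = torch.rand(3, 32, 32)
#   refined = refine_non_critical_regions(img, mask, proto)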

@article{tran2025_2503.18267,
  title={Enhancing Dataset Distillation via Non-Critical Region Refinement},
  author={Minh-Tuan Tran and Trung Le and Xuan-May Le and Thanh-Toan Do and Dinh Phung},
  journal={arXiv preprint arXiv:2503.18267},
  year={2025}
}