Hierarchical Features Matter: A Deep Exploration of Progressive Parameterization Method for Dataset Distillation

9 June 2024
Xinhao Zhong
Hao Fang
Bin Chen
Xulin Gu
Tao Dai
Meikang Qiu
Shu-Tao Xia
Abstract

Dataset distillation is an emerging dataset reduction method that condenses large-scale datasets while maintaining task accuracy. Current parameterization methods achieve enhanced performance under extremely high compression ratios by optimizing the synthetic dataset in an informative feature domain. However, they restrict themselves to a fixed optimization space for distillation, neglecting the diverse guidance available across different informative latent spaces. To overcome this limitation, we propose a novel parameterization method, dubbed Hierarchical Parameterization Distillation (H-PD), which systematically explores hierarchical features within the provided feature space (e.g., the layers of a pre-trained generative adversarial network). We verify the correctness of our insights by applying the hierarchical optimization strategy to a GAN-based parameterization method. In addition, we introduce a novel class-relevant feature distance metric that alleviates the computational burden associated with synthetic dataset evaluation, bridging the gap between synthetic and original datasets. Experimental results demonstrate that the proposed H-PD achieves significant performance improvements under various settings with equivalent time consumption, and even surpasses current generative distillation methods based on diffusion models under the extreme compression ratios IPC=1 and IPC=10.
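The abstract describes optimizing the synthetic dataset progressively across the hierarchical feature spaces of a pre-trained generator, guided by a class-relevant feature distance. The sketch below is a minimal illustration of that idea, not the authors' implementation: the generator stages, the feature extractor, and the helpers `generate_from` and `class_feature_distance` are illustrative stand-ins, and a real setup would use a pre-trained GAN and real class-wise data.

```python
# Minimal sketch of hierarchical (progressive) parameterization for dataset
# distillation. All modules, shapes, and names here are illustrative stand-ins.
import torch
import torch.nn as nn

# Stand-in generator split into sequential stages (a real method would use a
# pre-trained GAN and optimize in its latent / intermediate feature spaces).
g_stages = nn.ModuleList([
    nn.Sequential(nn.Linear(64, 128), nn.ReLU()),            # latent -> shallow feature
    nn.Sequential(nn.Linear(128, 256), nn.ReLU()),           # shallow -> deeper feature
    nn.Sequential(nn.Linear(256, 3 * 32 * 32), nn.Tanh()),   # deeper -> image
])
# Stand-in frozen feature extractor used for the distance metric.
feature_net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128))

def generate_from(stage_idx, code):
    """Run the remaining generator stages starting from an intermediate code."""
    x = code
    for stage in g_stages[stage_idx:]:
        x = stage(x)
    return x.view(-1, 3, 32, 32)

def class_feature_distance(syn_images, real_images):
    """Illustrative class-relevant distance: match the mean feature of the
    synthetic batch against the mean feature of same-class real images."""
    syn_feat = feature_net(syn_images).mean(dim=0)
    real_feat = feature_net(real_images).mean(dim=0)
    return (syn_feat - real_feat).pow(2).sum()

# Progressive optimization over hierarchical feature spaces for one class.
ipc = 10                                    # images per class (e.g., IPC=10)
real_batch = torch.randn(128, 3, 32, 32)    # placeholder for real images of this class
code = torch.randn(ipc, 64, requires_grad=True)  # start in the latent space

for stage_idx in range(len(g_stages)):
    opt = torch.optim.Adam([code], lr=1e-2)
    for _ in range(100):                    # distillation steps at this level
        opt.zero_grad()
        syn = generate_from(stage_idx, code)
        loss = class_feature_distance(syn, real_batch)
        loss.backward()
        opt.step()
    # Move the optimization variable one level deeper in the hierarchy.
    if stage_idx + 1 < len(g_stages):
        code = g_stages[stage_idx](code).detach().requires_grad_(True)
```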

@article{zhong2025_2406.05704,
  title={Hierarchical Features Matter: A Deep Exploration of Progressive Parameterization Method for Dataset Distillation},
  author={Xinhao Zhong and Hao Fang and Bin Chen and Xulin Gu and Meikang Qiu and Shuhan Qi and Shu-Tao Xia},
  journal={arXiv preprint arXiv:2406.05704},
  year={2025}
}