Hyperflows: Pruning Reveals the Importance of Weights

6 April 2025
Eugen Barbulescu
Antonio Alexoaie
Abstract

Network pruning is used to reduce inference latency and power consumption in large neural networks. However, most existing methods struggle to accurately assess the importance of individual weights due to their inherent interrelatedness, leading to poor performance, especially at extreme sparsity levels. We introduce Hyperflows, a dynamic pruning approach that estimates each weight's importance by observing the network's gradient response to the weight's removal. A global pressure term continuously drives all weights toward pruning, with those critical for accuracy being automatically regrown based on their flow, the aggregated gradient signal when they are absent. We explore the relationship between final sparsity and pressure, deriving power-law equations similar to those found in neural scaling laws. Empirically, we demonstrate state-of-the-art results with ResNet-50 and VGG-19 on CIFAR-10 and CIFAR-100.
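The abstract describes a prune-and-regrow loop: a global pressure term shrinks every weight toward zero, pruned weights accumulate "flow" (the gradient signal they would have received), and weights whose flow grows large are regrown. The sketch below is a toy illustration of that general mechanism on a vector of weights, not the paper's actual algorithm; the function name, hyperparameters, and update rules are all assumptions for illustration.

```python
import numpy as np

def hyperflows_sketch(w, grad_fn, steps=200, lr=0.1,
                      pressure=0.01, regrow_thresh=0.5):
    """Toy dynamic pruning loop (illustrative assumption, not the
    paper's method). mask[i] == 0 means weight i is pruned; while
    pruned, the magnitude of the gradient it would receive accumulates
    into flow[i], and crossing regrow_thresh regrows the weight. A
    global pressure term shrinks all active weights each step."""
    mask = np.ones_like(w)
    flow = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w * mask)
        # gradient step on active weights only
        w = w - lr * g * mask
        # global pressure: soft-threshold every weight toward zero
        w = np.sign(w) * np.maximum(np.abs(w) - lr * pressure, 0.0)
        # weights driven to zero are pruned
        mask[np.abs(w) < 1e-8] = 0.0
        # pruned weights accumulate flow; active weights reset to zero
        flow = np.where(mask == 0, flow + np.abs(g), 0.0)
        # regrow pruned weights whose accumulated flow is large
        regrow = (mask == 0) & (flow > regrow_thresh)
        mask[regrow] = 1.0
        # reinitialize regrown weights with a small step against the gradient
        w = np.where(regrow, -np.sign(g) * lr * 1e-3, w)
        flow[regrow] = 0.0
    return w * mask, mask
```

On a simple quadratic objective, weights whose targets are zero are pruned by the pressure term and never accumulate flow, while weights with nonzero targets survive and converge; higher pressure yields higher final sparsity, which is the trade-off the paper characterizes with power-law fits.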

@article{barbulescu2025_2504.05349,
  title={Hyperflows: Pruning Reveals the Importance of Weights},
  author={Eugen Barbulescu and Antonio Alexoaie},
  journal={arXiv preprint arXiv:2504.05349},
  year={2025}
}