Towards Quantifying the Hessian Structure of Neural Networks

5 May 2025
Zhaorui Dong
Yushun Zhang
Zhi-Quan Luo
Jianfeng Yao
Ruoyu Sun
Abstract

Empirical studies have reported that the Hessian matrix of neural networks (NNs) exhibits a near-block-diagonal structure, yet its theoretical foundation remains unclear. In this work, we reveal two forces that shape the Hessian structure: a ``static force'' rooted in the architecture design, and a ``dynamic force'' arising from training. We then provide a rigorous theoretical analysis of the ``static force'' at random initialization. We study linear models and 1-hidden-layer networks with the mean-squared error (MSE) loss and the cross-entropy (CE) loss for classification tasks. By leveraging random matrix theory, we compare the limit distributions of the diagonal and off-diagonal Hessian blocks and find that the block-diagonal structure arises as $C \rightarrow \infty$, where $C$ denotes the number of classes. Our findings reveal that $C$ is a primary driver of the near-block-diagonal structure. These results may shed new light on the Hessian structure of large language models (LLMs), which typically operate with a large $C$ exceeding $10^4$ or $10^5$.
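As a rough numerical illustration of the abstract's claim (a sketch, not code from the paper): for a linear softmax classifier with CE loss at random initialization, the Hessian with respect to the weight matrix splits into C x C class-indexed blocks, and with near-uniform class probabilities the off-diagonal blocks shrink relative to the diagonal ones as C grows. The input dimension, sample count, initialization scale, and list of C values below are arbitrary illustrative choices, not the paper's experimental setup.

# Sketch: off-diagonal vs. diagonal Hessian block norms for a linear
# softmax + CE model at random initialization. Block (c, c') of the
# Hessian, averaged over inputs x, is E[(p_c * 1{c==c'} - p_c * p_c') x x^T];
# with p_c ~ 1/C at initialization the off-/on-diagonal norm ratio
# shrinks roughly like 1/C.
import numpy as np

rng = np.random.default_rng(0)

def block_norm_ratio(C, d=20, n=500, init_scale=0.01):
    X = rng.standard_normal((n, d))
    W = init_scale * rng.standard_normal((d, C))       # random initialization
    logits = X @ W
    logits -= logits.max(axis=1, keepdims=True)         # numerical stability
    P = np.exp(logits)
    P /= P.sum(axis=1, keepdims=True)                   # softmax probabilities

    xxT = np.einsum("ni,nj->nij", X, X)                  # per-sample x x^T
    diag_norms, off_norms = [], []
    for c in range(C):
        for cp in range(C):
            coef = P[:, c] * ((c == cp) - P[:, cp])      # p_c (1{c=c'} - p_c')
            block = np.tensordot(coef, xxT, axes=1) / n  # d x d Hessian block
            norm = np.linalg.norm(block)
            (diag_norms if c == cp else off_norms).append(norm)
    return np.mean(off_norms) / np.mean(diag_norms)

for C in (5, 20, 80):
    print(f"C={C:3d}  off-diag / diag block norm ratio ~ {block_norm_ratio(C):.4f}")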

View on arXiv
@article{dong2025_2505.02809,
  title={Towards Quantifying the Hessian Structure of Neural Networks},
  author={Zhaorui Dong and Yushun Zhang and Zhi-Quan Luo and Jianfeng Yao and Ruoyu Sun},
  journal={arXiv preprint arXiv:2505.02809},
  year={2025}
}