Analytic Characterization of the Hessian in Shallow ReLU Models: A Tale of Symmetry

4 August 2020
Yossi Arjevani
M. Field
arXiv:2008.01805
Abstract

We consider the optimization problem associated with fitting two-layer ReLU networks with respect to the squared loss, where labels are generated by a target network. We leverage the rich symmetry structure to analytically characterize the Hessian at various families of spurious minima in the natural regime where the number of inputs $d$ and the number of hidden neurons $k$ are finite. In particular, we prove that for $d \ge k$ and standard Gaussian inputs: (a) of the $dk$ eigenvalues of the Hessian, $dk - O(d)$ concentrate near zero, and (b) $\Omega(d)$ of the eigenvalues grow linearly with $k$. Although this phenomenon of an extremely skewed spectrum has been observed many times before, to our knowledge, this is the first time it has been established rigorously. Our analytic approach uses techniques new to the field, drawn from symmetry breaking and representation theory, and carries important implications for our ability to argue about statistical generalization through local curvature.
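To make the spectral claim concrete, here is a minimal numerical sketch, not the paper's analytic method: it fits a two-layer ReLU student to a planted ReLU target on sampled standard Gaussian inputs and inspects the empirical Hessian spectrum at the endpoint of gradient descent. The specific architecture (a sum of $k$ ReLU units with no output or bias weights), the finite-sample approximation of the expected loss, and all dimensions and hyperparameters are illustrative assumptions, not taken from the paper.

```python
# Numerical sketch (illustrative assumptions throughout, not the paper's
# analytic derivation): estimate the Hessian spectrum of the squared loss
# for a two-layer ReLU "student" fitted to a planted ReLU "target",
# with standard Gaussian inputs.
import jax
import jax.numpy as jnp

d, k, n = 8, 6, 20000          # inputs, hidden neurons (d >= k), samples
kx, kv, kw = jax.random.split(jax.random.PRNGKey(0), 3)

X = jax.random.normal(kx, (n, d))          # standard Gaussian inputs
V = jax.random.normal(kv, (k, d))          # planted target weights
y = jax.nn.relu(X @ V.T).sum(axis=1)       # labels from the target network

def loss(W_flat):
    # Student: f(x; W) = sum_i relu(w_i . x); expectation over x ~ N(0, I_d)
    # is approximated by the empirical mean over the n samples.
    W = W_flat.reshape(k, d)
    pred = jax.nn.relu(X @ W.T).sum(axis=1)
    return 0.5 * jnp.mean((pred - y) ** 2)

# Plain gradient descent from a random init, reaching a stationary point
# (possibly one of the spurious minima studied in the paper).
W = 0.5 * jax.random.normal(kw, (k * d,))
grad = jax.jit(jax.grad(loss))
for _ in range(5000):
    W = W - 0.05 * grad(W)

# The Hessian is a (dk x dk) matrix; the paper proves that dk - O(d) of its
# eigenvalues concentrate near zero while Omega(d) grow linearly with k.
H = jax.hessian(loss)(W)
eigs = jnp.sort(jnp.linalg.eigvalsh(H))
# 1e-2 is a loose "near zero" threshold for this finite-sample sketch.
print("near-zero eigenvalues:", int(jnp.sum(jnp.abs(eigs) < 1e-2)), "of", d * k)
print("largest eigenvalues:", eigs[-5:])
```

At a (possibly spurious) minimum found this way, most of the $dk = 48$ eigenvalues should cluster near zero, with a handful of large outliers, mirroring the skewed spectrum the paper establishes analytically.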
