ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

A Random Matrix Theory Approach to Damping in Deep Learning

15 November 2020
Diego Granziol, Nicholas P. Baskerville
Communities: AI4CE, ODL

Papers citing "A Random Matrix Theory Approach to Damping in Deep Learning"

3 / 3 papers shown
1. Universal characteristics of deep neural network loss surfaces from random matrix theory
   Nicholas P. Baskerville, J. Keating, F. Mezzadri, J. Najnudel, Diego Granziol
   17 May 2022
2. Cleaning large correlation matrices: tools from random matrix theory
   J. Bun, J. Bouchaud, M. Potters
   25 Oct 2016
3. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
   N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang (ODL)
   15 Sep 2016