On Random Matrices Arising in Deep Neural Networks. Gaussian Case

17 January 2020
L. Pastur

Papers citing "On Random Matrices Arising in Deep Neural Networks. Gaussian Case"

5 / 5 papers shown
1. Universal characteristics of deep neural network loss surfaces from random matrix theory
   Nicholas P. Baskerville, J. Keating, F. Mezzadri, J. Najnudel, Diego Granziol
   17 May 2022

2. Concentration of Random Feature Matrices in High-Dimensions
   Zhijun Chen, Hayden Schaeffer, Rachel A. Ward
   14 Apr 2022

3. Conditioning of Random Feature Matrices: Double Descent and Generalization Error
   Zhijun Chen, Hayden Schaeffer
   21 Oct 2021

4. Asymptotic Freeness of Layerwise Jacobians Caused by Invariance of Multilayer Perceptron: The Haar Orthogonal Case
   B. Collins, Tomohiro Hayase
   24 Mar 2021

5. On Random Matrices Arising in Deep Neural Networks: General I.I.D. Case
   L. Pastur, V. Slavin
   20 Nov 2020