Deformed semicircle law and concentration of nonlinear random matrices for ultra-wide neural networks
Zhichao Wang, Yizhe Zhu
arXiv:2109.09304, 20 September 2021
Papers citing "Deformed semicircle law and concentration of nonlinear random matrices for ultra-wide neural networks" (7 of 7 papers shown):
1. How Spurious Features Are Memorized: Precise Analysis for Random and NTK Features [AAML]. Simone Bombari, Marco Mondelli. 20 May 2023.
2. Beyond the Universal Law of Robustness: Sharper Laws for Random Features and Neural Tangent Kernels [AAML]. Simone Bombari, Shayan Kiyani, Marco Mondelli. 03 Feb 2023.
3. High-dimensional Asymptotics of Feature Learning: How One Gradient Step Improves the Representation [MLT]. Jimmy Ba, Murat A. Erdogdu, Taiji Suzuki, Zhichao Wang, Denny Wu, Greg Yang. 03 May 2022.
4. Concentration of Random Feature Matrices in High-Dimensions. Zhijun Chen, Hayden Schaeffer, Rachel A. Ward. 14 Apr 2022.
5. On the Proof of Global Convergence of Gradient Descent for Deep ReLU Networks with Linear Widths. Quynh N. Nguyen. 24 Jan 2021.
6. Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks. Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington. 14 Jun 2018.
7. Sharp analysis of low-rank kernel matrix approximations. Francis R. Bach. 09 Aug 2012.