arXiv:2106.06251 (Cited By)
On Learnability via Gradient Method for Two-Layer ReLU Neural Networks in Teacher-Student Setting
11 June 2021
Shunta Akiyama, Taiji Suzuki
Community: MLT
Papers citing "On Learnability via Gradient Method for Two-Layer ReLU Neural Networks in Teacher-Student Setting"
7 / 7 papers shown
| Title | Authors | Metrics | Date |
| --- | --- | --- | --- |
| Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks | Fanghui Liu, L. Dadi, V. Cevher | 74 · 2 · 0 | 29 Apr 2024 |
| Annihilation of Spurious Minima in Two-Layer ReLU Networks | Yossi Arjevani, M. Field | 16 · 8 · 0 | 12 Oct 2022 |
| Convex Analysis of the Mean Field Langevin Dynamics (MLT) | Atsushi Nitanda, Denny Wu, Taiji Suzuki | 59 · 64 · 0 | 25 Jan 2022 |
| Parallel Deep Neural Networks Have Zero Duality Gap | Yifei Wang, Tolga Ergen, Mert Pilanci | 79 · 10 · 0 | 13 Oct 2021 |
| Analytic Study of Families of Spurious Minima in Two-Layer ReLU Neural Networks: A Tale of Symmetry II | Yossi Arjevani, M. Field | 28 · 18 · 0 | 21 Jul 2021 |
| A Local Convergence Theory for Mildly Over-Parameterized Two-Layer Neural Network | Mo Zhou, Rong Ge, Chi Jin | 69 · 44 · 0 | 04 Feb 2021 |
| Norm-Based Capacity Control in Neural Networks | Behnam Neyshabur, Ryota Tomioka, Nathan Srebro | 119 · 577 · 0 | 27 Feb 2015 |