Neural network learns low-dimensional polynomials with SGD near the information-theoretic limit
Jason D. Lee, Kazusato Oko, Taiji Suzuki, Denny Wu
arXiv:2406.01581 (MLT). 3 Jun 2024.
Papers citing "Neural network learns low-dimensional polynomials with SGD near the information-theoretic limit" (8 papers):
1. Mean-Field Analysis for Learning Subspace-Sparse Polynomials with Gaussian Input. Ziang Chen, Rong Ge (MLT). 10 Jan 2025.
2. Learning Gaussian Multi-Index Models with Gradient Flow: Time Complexity and Directional Convergence. Berfin Simsek, Amire Bendjeddou, Daniel Hsu. 13 Nov 2024.
3. Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks. Fanghui Liu, L. Dadi, V. Cevher. 29 Apr 2024.
4. Nonlinear spiked covariance matrices and signal propagation in deep neural networks. Zhichao Wang, Denny Wu, Zhou Fan. 15 Feb 2024.
5. The Benefits of Reusing Batches for Gradient Descent in Two-Layer Networks: Breaking the Curse of Information and Leap Exponents. Yatin Dandi, Emanuele Troiani, Luca Arnaboldi, Luca Pesce, Lenka Zdeborová, Florent Krzakala (MLT). 5 Feb 2024.
6. SGD learning on neural networks: leap complexity and saddle-to-saddle dynamics. Emmanuel Abbe, Enric Boix-Adserà, Theodor Misiakiewicz (FedML, MLT). 21 Feb 2023.
7. Learning Single-Index Models with Shallow Neural Networks. A. Bietti, Joan Bruna, Clayton Sanford, M. Song. 27 Oct 2022.
8. Neural Networks Efficiently Learn Low-Dimensional Representations with SGD. Alireza Mousavi-Hosseini, Sejun Park, M. Girotti, Ioannis Mitliagkas, Murat A. Erdogdu (MLT). 29 Sep 2022.