arXiv:2210.06705 · Cited By
From Gradient Flow on Population Loss to Learning with Stochastic Gradient Descent
Neural Information Processing Systems (NeurIPS), 2022
13 October 2022
Satyen Kale, Jason D. Lee, Chris De Sa, Ayush Sekhari, Karthik Sridharan
Papers citing "From Gradient Flow on Population Loss to Learning with Stochastic Gradient Descent" (4 of 4 papers shown)
Can Looped Transformers Learn to Implement Multi-step Gradient Descent for In-context Learning?
International Conference on Machine Learning (ICML), 2024
Khashayar Gatmiry, Nikunj Saunshi, Sashank J. Reddi, Stefanie Jegelka, Sanjiv Kumar
10 Oct 2024
Convergence of continuous-time stochastic gradient descent with applications to deep neural networks
Gabor Lugosi, Eulalia Nualart
11 Sep 2024
Dr. FERMI: A Stochastic Distributionally Robust Fair Empirical Risk Minimization Framework
Sina Baharlouei, Meisam Razaviyayn
20 Sep 2023
Convergence of stochastic gradient descent under a local Łojasiewicz condition for deep neural networks
Jing An, Jianfeng Lu
18 Apr 2023