Demystifying the Myths and Legends of Nonconvex Convergence of SGD
arXiv:2310.12969, 19 October 2023
Aritra Dutta, El Houcine Bergou, Soumia Boucherouite, Nicklas Werge, M. Kandemir, Xin Li
Papers citing "Demystifying the Myths and Legends of Nonconvex Convergence of SGD" (5 papers shown):

Stochastic Approximation Beyond Gradient for Signal Processing and Machine Learning. Aymeric Dieuleveut, G. Fort, Eric Moulines, Hoi-To Wai. 22 Feb 2023.
On the Convergence of Step Decay Step-Size for Stochastic Optimization. Xiaoyu Wang, Sindri Magnússon, M. Johansson. 18 Feb 2021.
A Simple Convergence Proof of Adam and Adagrad. Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier. 05 Mar 2020.
Stochastic Nonconvex Optimization with Large Minibatches. Weiran Wang, Nathan Srebro. 25 Sep 2017.
Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes. Ohad Shamir, Tong Zhang. 08 Dec 2012.