Revisiting Convergence of AdaGrad with Relaxed Assumptions
arXiv:2402.13794
21 February 2024
Yusu Hong, Junhong Lin
Papers citing "Revisiting Convergence of AdaGrad with Relaxed Assumptions" (5 papers):
- Sharp higher order convergence rates for the Adam optimizer. Steffen Dereich, Arnulf Jentzen, Adrian Riekert. 28 Apr 2025.
- On Convergence of Adam for Stochastic Optimization under Relaxed Assumptions. Yusu Hong, Junhong Lin. 06 Feb 2024.
- On the Convergence of AdaGrad(Norm) on $\mathbb{R}^{d}$: Beyond Convexity, Non-Asymptotic Rate and Acceleration. Zijian Liu, Ta Duy Nguyen, Alina Ene, Huy Le Nguyen. 29 Sep 2022.
- A High Probability Analysis of Adaptive SGD with Momentum. Xiaoyun Li, Francesco Orabona. 28 Jul 2020.
- A Simple Convergence Proof of Adam and Adagrad. Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier. 05 Mar 2020.