On Convergence of Adam for Stochastic Optimization under Relaxed Assumptions
Yusu Hong, Junhong Lin
arXiv:2402.03982, 6 February 2024
Papers citing "On Convergence of Adam for Stochastic Optimization under Relaxed Assumptions" (8 papers)
- LDAdam: Adaptive Optimization from Low-Dimensional Gradient Statistics. Thomas Robert, M. Safaryan, Ionut-Vlad Modoranu, Dan Alistarh. 21 Oct 2024.
- Large Batch Analysis for Adagrad Under Anisotropic Smoothness. Yuxing Liu, Rui Pan, Tong Zhang. 21 Jun 2024.
- The Implicit Bias of Adam on Separable Data. Chenyang Zhang, Difan Zou, Yuan Cao. 15 Jun 2024.
- Revisiting Convergence of AdaGrad with Relaxed Assumptions. Yusu Hong, Junhong Lin. 21 Feb 2024.
- Convergence of Adam Under Relaxed Assumptions. Haochuan Li, Alexander Rakhlin, Ali Jadbabaie. 27 Apr 2023.
- A High Probability Analysis of Adaptive SGD with Momentum. Xiaoyun Li, Francesco Orabona. 28 Jul 2020.
- A Simple Convergence Proof of Adam and Adagrad. Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier. 5 Mar 2020.
- Densely Connected Convolutional Networks. Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger. 25 Aug 2016.