Convergence of AdaGrad for Non-convex Objectives: Simple Proofs and Relaxed Assumptions
Bo Wang, Huishuai Zhang, Zhirui Ma, Wei Chen
arXiv:2305.18471 · 29 May 2023
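For context, the AdaGrad step analyzed in this paper scales each coordinate by the root of its accumulated squared gradients. A minimal sketch of that update, assuming a generic differentiable objective (the function, step size, and parameters below are illustrative, not the paper's setup):

```python
import numpy as np

def adagrad(grad, x0, lr=0.1, eps=1e-8, steps=1000):
    """Minimal AdaGrad sketch: per-coordinate step sizes scaled by
    the square root of the accumulated squared gradients."""
    x = np.asarray(x0, dtype=float)
    acc = np.zeros_like(x)              # running sum of g_t ** 2
    for _ in range(steps):
        g = grad(x)
        acc += g * g
        x -= lr * g / (np.sqrt(acc) + eps)
    return x

# Illustrative use on f(x) = ||x||^2, whose gradient is 2x.
x_star = adagrad(lambda x: 2.0 * x, x0=[3.0, -2.0])
```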
Papers citing "Convergence of AdaGrad for Non-convex Objectives: Simple Proofs and Relaxed Assumptions" (8 of 8 papers shown)
| Title | Authors | Metrics | Published |
|---|---|---|---|
| Understanding Gradient Orthogonalization for Deep Learning via Non-Euclidean Trust-Region Optimization | Dmitry Kovalev | 52 / 0 / 0 | 16 Mar 2025 |
| Sparklen: A Statistical Learning Toolkit for High-Dimensional Hawkes Processes in Python | Romain Edmond Lacoste | 53 / 0 / 0 | 26 Feb 2025 |
| An Accelerated Algorithm for Stochastic Bilevel Optimization under Unbounded Smoothness | Xiaochuan Gong, Jie Hao, Mingrui Liu | 31 / 2 / 0 | 28 Sep 2024 |
| An Adaptive Stochastic Gradient Method with Non-negative Gauss-Newton Stepsizes | Antonio Orvieto, Lin Xiao | 32 / 2 / 0 | 05 Jul 2024 |
| Convergence Guarantees for RMSProp and Adam in Generalized-smooth Non-convex Optimization with Affine Noise Variance | Qi Zhang, Yi Zhou, Shaofeng Zou | 27 / 3 / 0 | 01 Apr 2024 |
| On Convergence of Adam for Stochastic Optimization under Relaxed Assumptions | Yusu Hong, Junhong Lin | 38 / 10 / 0 | 06 Feb 2024 |
| A High Probability Analysis of Adaptive SGD with Momentum | Xiaoyun Li, Francesco Orabona | 79 / 64 / 0 | 28 Jul 2020 |
| A Simple Convergence Proof of Adam and Adagrad | Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier | 56 / 143 / 0 | 05 Mar 2020 |