Revisiting Convergence of AdaGrad with Relaxed Assumptions

21 February 2024
Yusu Hong, Junhong Lin

Papers citing "Revisiting Convergence of AdaGrad with Relaxed Assumptions"

5 / 5 papers shown
Title
  1. Sharp higher order convergence rates for the Adam optimizer. Steffen Dereich, Arnulf Jentzen, Adrian Riekert. 28 Apr 2025.
  2. On Convergence of Adam for Stochastic Optimization under Relaxed Assumptions. Yusu Hong, Junhong Lin. 06 Feb 2024.
  3. On the Convergence of AdaGrad(Norm) on $\mathbb{R}^d$: Beyond Convexity, Non-Asymptotic Rate and Acceleration. Zijian Liu, Ta Duy Nguyen, Alina Ene, Huy Le Nguyen. 29 Sep 2022.
  4. A High Probability Analysis of Adaptive SGD with Momentum. Xiaoyun Li, Francesco Orabona. 28 Jul 2020.
  5. A Simple Convergence Proof of Adam and Adagrad. Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier. 05 Mar 2020.