Understanding Adam Optimizer via Online Learning of Updates: Adam is FTRL in Disguise

2 February 2024
Kwangjun Ahn, Zhiyu Zhang, Yunbum Kook, Yan Dai
ArXiv · PDF · HTML
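As the title states, the paper analyzes Adam through online learning of the updates, arguing that Adam is follow-the-regularized-leader (FTRL) in disguise. For reference, below is a minimal sketch of the textbook Adam update that is being reinterpreted; the code and notation are illustrative and not taken from the paper or from this page.

```python
import numpy as np

def adam_step(x, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One textbook Adam step on parameters x given gradient g (illustrative sketch).

    t is the 1-indexed step count; m and v are the running moment estimates.
    Returns the updated (x, m, v).
    """
    m = beta1 * m + (1 - beta1) * g        # exponential moving average of gradients
    v = beta2 * v + (1 - beta2) * g ** 2   # exponential moving average of squared gradients
    m_hat = m / (1 - beta1 ** t)           # bias correction for the first moment
    v_hat = v / (1 - beta2 ** t)           # bias correction for the second moment
    x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x, m, v
```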

Papers citing "Understanding Adam Optimizer via Online Learning of Updates: Adam is FTRL in Disguise"

6 / 6 papers shown
| Title | Authors | Topics | Metrics | Date |
| --- | --- | --- | --- | --- |
| A Fine-Tuning Approach for T5 Using Knowledge Graphs to Address Complex Tasks | Xiaoxuan Liao, Binrong Zhu, Jacky He, Guiran Liu, Hongye Zheng, Jia Gao | | 35 / 5 / 0 | 23 Feb 2025 |
| Does SGD really happen in tiny subspaces? | Minhak Song, Kwangjun Ahn, Chulhee Yun | | 47 / 4 / 1 | 25 May 2024 |
| Convergence of Adam Under Relaxed Assumptions | Haochuan Li, Alexander Rakhlin, Ali Jadbabaie | | 26 / 53 / 0 | 27 Apr 2023 |
| Noise Is Not the Main Factor Behind the Gap Between SGD and Adam on Transformers, but Sign Descent Might Be | Frederik Kunstner, Jacques Chen, J. Lavington, Mark W. Schmidt | | 38 / 66 / 0 | 27 Apr 2023 |
| A new regret analysis for Adam-type algorithms | Ahmet Alacaoglu, Yura Malitsky, P. Mertikopoulos, V. Cevher | ODL | 38 / 41 / 0 | 21 Mar 2020 |
| A Simple Convergence Proof of Adam and Adagrad | Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier | | 56 / 143 / 0 | 05 Mar 2020 |