arXiv:2003.09729

A new regret analysis for Adam-type algorithms

21 March 2020
Ahmet Alacaoglu
Yura Malitsky
P. Mertikopoulos
V. Cevher
ODL
Abstract

In this paper, we focus on a theory-practice gap for Adam and its variants (AMSGrad, AdamNC, etc.). In practice, these algorithms are used with a constant first-order moment parameter $\beta_{1}$ (typically between $0.9$ and $0.99$). In theory, regret guarantees for online convex optimization require a rapidly decaying schedule $\beta_{1} \to 0$. We show that this is an artifact of the standard analysis and propose a novel framework that allows us to derive optimal, data-dependent regret bounds with a constant $\beta_{1}$, without further assumptions. We also demonstrate the flexibility of our analysis on a wide range of different algorithms and settings.
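
Here, "data-dependent" refers to regret bounds that scale with the gradients actually observed rather than with a worst-case constant, for example terms of the form $\sum_{i=1}^{d}\|g_{1:T,i}\|_{2}$ familiar from the AdaGrad line of work; the paper's exact statements may differ. For orientation, below is a minimal NumPy sketch of an AMSGrad-style update with a constant $\beta_{1}$, which is the regime the analysis covers. The function name, default hyperparameters, and the omission of bias correction are illustrative assumptions, not the paper's exact algorithm.

import numpy as np

def amsgrad_step(x, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AMSGrad-style update with a *constant* beta1, as used in practice.

    `state` holds the moment estimates; pass an empty dict on the first call.
    Names and defaults here are illustrative, not the paper's exact setup.
    """
    m = state.get("m", np.zeros_like(x))          # first moment (EMA of gradients)
    v = state.get("v", np.zeros_like(x))          # second moment (EMA of squared gradients)
    v_hat = state.get("v_hat", np.zeros_like(x))  # running max of v (the AMSGrad correction)

    m = beta1 * m + (1.0 - beta1) * grad
    v = beta2 * v + (1.0 - beta2) * grad ** 2
    v_hat = np.maximum(v_hat, v)                  # keeps the effective step size non-increasing

    x_new = x - lr * m / (np.sqrt(v_hat) + eps)

    state.update(m=m, v=v, v_hat=v_hat)
    return x_new, state

Note that beta1 stays fixed across iterations; the point of the paper is that regret guarantees for this constant-beta1 regime can be obtained without modifying the update.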
