Adaptive first-order methods revisited: Convex optimization without Lipschitz requirements

16 July 2021
Kimon Antonakopoulos, P. Mertikopoulos

Papers citing "Adaptive first-order methods revisited: Convex optimization without Lipschitz requirements"

3 papers shown

Distributed Extra-gradient with Optimal Complexity and Communication Guarantees
Ali Ramezani-Kebrya, Kimon Antonakopoulos, Igor Krawczuk, Justin Deschenaux, V. Cevher
17 Aug 2023

Grad-GradaGrad? A Non-Monotone Adaptive Stochastic Gradient Method
Aaron Defazio, Baoyu Zhou, Lin Xiao
ODL
14 Jun 2022

Nest Your Adaptive Algorithm for Parameter-Agnostic Nonconvex Minimax Optimization
Junchi Yang, Xiang Li, Niao He
ODL
01 Jun 2022