
Restarts subject to approximate sharpness: A parameter-free and optimal scheme for first-order methods

5 January 2023
Ben Adcock
Matthew J. Colbrook
Maksym Neyra-Nesterenko

Papers citing "Restarts subject to approximate sharpness: A parameter-free and optimal scheme for first-order methods"

3 / 3 papers shown

  • Acceleration Methods. Alexandre d’Aspremont, Damien Scieur, Adrien B. Taylor. 23 Jan 2021.
  • Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition. Hamed Karimi, J. Nutini, Mark W. Schmidt. 16 Aug 2016.
  • A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights. Weijie Su, Stephen P. Boyd, Emmanuel J. Candes. 04 Mar 2015.