The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization

19 May 2022
D. Kovalev, Alexander Gasnikov

Papers citing "The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization"

6 papers shown:
  • Faster Acceleration for Steepest Descent, by Site Bai and Brian Bullins (28 Sep 2024)
  • Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method, by N. Doikov (28 Aug 2023)
  • Explicit Second-Order Min-Max Optimization Methods with Optimal Convergence Guarantee, by Tianyi Lin, P. Mertikopoulos, and Michael I. Jordan (23 Oct 2022)
  • Smooth Monotone Stochastic Variational Inequalities and Saddle Point Problems: A Survey, by Aleksandr Beznosikov, Boris Polyak, Eduard A. Gorbunov, D. Kovalev, and Alexander Gasnikov (29 Aug 2022)
  • Perseus: A Simple and Optimal High-Order Method for Variational Inequalities, by Tianyi Lin and Michael I. Jordan (06 May 2022)
  • Generalized Optimistic Methods for Convex-Concave Saddle Point Problems, by Ruichen Jiang and Aryan Mokhtari (19 Feb 2022)