Sharper Rates for Separable Minimax and Finite Sum Optimization via Primal-Dual Extragradient Methods

9 February 2022
Yujia Jin, Aaron Sidford, Kevin Tian

Papers citing "Sharper Rates for Separable Minimax and Finite Sum Optimization via Primal-Dual Extragradient Methods"

4 papers shown

Negative Stepsizes Make Gradient-Descent-Ascent Converge
Henry Shugart, Jason M. Altschuler
2 May 2025

Smooth Monotone Stochastic Variational Inequalities and Saddle Point Problems: A Survey
Aleksandr Beznosikov, Boris Polyak, Eduard A. Gorbunov, D. Kovalev, Alexander Gasnikov
29 August 2022

The First Optimal Algorithm for Smooth and Strongly-Convex-Strongly-Concave Minimax Optimization
D. Kovalev, Alexander Gasnikov
11 May 2022

Stochastic Variance Reduction for Variational Inequality Methods
Ahmet Alacaoglu, Yura Malitsky
16 February 2021