Last-Iterate Convergence of Saddle-Point Optimizers via High-Resolution Differential Equations
arXiv:2112.13826
27 December 2021
Tatjana Chavdarova, Michael I. Jordan, Manolis Zampetakis

Papers citing "Last-Iterate Convergence of Saddle-Point Optimizers via High-Resolution Differential Equations"

4 / 4 papers shown
  1. Learning Variational Inequalities from Data: Fast Generalization Rates under Strong Monotonicity
     Eric Zhao, Tatjana Chavdarova, Michael I. Jordan
     20 Feb 2025
  2. On a continuous time model of gradient descent dynamics and instability in deep learning
     Mihaela Rosca, Yan Wu, Chongli Qin, Benoit Dherin
     03 Feb 2023
  3. On Solving Minimax Optimization Locally: A Follow-the-Ridge Approach
     Yuanhao Wang, Guodong Zhang, Jimmy Ba
     16 Oct 2019
  4. A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
     Weijie Su, Stephen P. Boyd, Emmanuel J. Candes
     04 Mar 2015