
arXiv:2007.04528
Higher-order methods for convex-concave min-max optimization and monotone variational inequalities

9 July 2020
Brian Bullins
Kevin A. Lai
Abstract

We provide improved convergence rates for constrained convex-concave min-max problems and monotone variational inequalities with higher-order smoothness. In min-max settings where the $p^{th}$-order derivatives are Lipschitz continuous, we give an algorithm HigherOrderMirrorProx that achieves an iteration complexity of $O(1/T^{\frac{p+1}{2}})$ when given access to an oracle for finding a fixed point of a $p^{th}$-order equation. We give analogous rates for the weak monotone variational inequality problem. For $p > 2$, our results improve upon the iteration complexity of the first-order Mirror Prox method of Nemirovski [2004] and the second-order method of Monteiro and Svaiter [2012]. We further instantiate our entire algorithm in the unconstrained $p = 2$ case.
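As background for the abstract, the first-order Mirror Prox baseline of Nemirovski [2004] that the paper improves on is the extragradient scheme: an extrapolation step followed by an update using the operator evaluated at the extrapolated point. The sketch below runs it on a toy bilinear saddle-point problem $\min_x \max_y\, x^\top A y$ over a Euclidean ball; the matrix, radius, and step size are illustrative choices, not taken from the paper.

```python
import numpy as np

# Toy bilinear saddle-point problem  min_x max_y  x^T A y  over a ball.
# Its monotone operator is F(x, y) = (A y, -A^T x), and the unique
# saddle point is (x, y) = (0, 0).
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])

def F(z):
    # Monotone operator of the saddle problem, stacked as z = (x, y).
    x, y = z[:2], z[2:]
    return np.concatenate([A @ y, -A.T @ x])

def proj_ball(z, r=1.0):
    # Euclidean projection onto the ball of radius r — the "mirror" step
    # for the squared-norm distance-generating function.
    n = np.linalg.norm(z)
    return z if n <= r else z * (r / n)

eta = 0.5 / np.linalg.norm(A, 2)   # step size below 1/L, with L = ||A||_2
z = proj_ball(np.ones(4))

for _ in range(2000):
    w = proj_ball(z - eta * F(z))  # extrapolation step
    z = proj_ball(z - eta * F(w))  # update uses the operator at w

print(np.linalg.norm(z))           # distance to the saddle point (0, 0)
```

For this monotone Lipschitz operator the iterates converge to the saddle point; the paper's HigherOrderMirrorProx replaces the two first-order steps with a $p^{th}$-order fixed-point oracle to obtain the faster $O(1/T^{\frac{p+1}{2}})$ rate.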
