Higher-order methods for convex-concave min-max optimization and
monotone variational inequalities
SIAM Journal on Optimization (SIOPT), 2020
Abstract
We provide improved convergence rates for constrained convex-concave min-max problems and monotone variational inequalities with higher-order smoothness. In min-max settings where the $p^{th}$-order derivatives are Lipschitz continuous, we give an algorithm HigherOrderMirrorProx that achieves an iteration complexity of $O(1/\epsilon^{2/(p+1)})$ when given access to an oracle for finding a fixed point of a $p^{th}$-order equation. We give analogous rates for the weak monotone variational inequality problem. For $p > 2$, our results improve upon the iteration complexity of the first-order Mirror Prox method of Nemirovski [2004] and the second-order method of Monteiro and Svaiter [2012]. We further instantiate our entire algorithm in the unconstrained $p = 2$ case.
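As one concrete reading of this claim (a sketch assuming the $O(1/\epsilon^{2/(p+1)})$ iteration complexity stated above, not a display taken from the paper), the general bound specializes to the known first- and second-order rates and improves beyond them for $p > 2$:
\[
\underbrace{O\!\left(\epsilon^{-1}\right)}_{p=1:\ \text{Mirror Prox, Nemirovski [2004]}}
\qquad
\underbrace{O\!\left(\epsilon^{-2/3}\right)}_{p=2:\ \text{Monteiro and Svaiter [2012]}}
\qquad
\underbrace{O\!\left(\epsilon^{-1/2}\right)}_{p=3:\ \text{third-order smoothness}}
\]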
