Second-Order Min-Max Optimization with Lazy Hessians
This paper studies second-order methods for convex-concave minimax optimization. Monteiro and Svaiter (2012) proposed a method that solves the problem with an optimal iteration complexity of $\mathcal{O}(\epsilon^{-2/3})$ to find an $\epsilon$-saddle point. However, it is unclear whether the computational complexity, $\mathcal{O}((N+d^2)d\epsilon^{-2/3})$, can be improved. In the above, we follow Doikov et al. (2023) and assume the complexity of obtaining a first-order oracle is $N$ and the complexity of obtaining a second-order oracle is $dN$. In this paper, we show that the computational cost can be reduced by reusing the Hessian across iterations. Our methods achieve an overall computational complexity of $\tilde{\mathcal{O}}((N+d^2)(d+d^{2/3}\epsilon^{-2/3}))$, which improves upon previous methods by a factor of $d^{1/3}$. Furthermore, we generalize our method to strongly-convex-strongly-concave minimax problems and establish a complexity of $\tilde{\mathcal{O}}((N+d^2)(d+d^{2/3}\kappa^{2/3}))$ when the condition number of the problem is $\kappa$, enjoying a similar speedup over the state-of-the-art method. Numerical experiments on both real and synthetic datasets also verify the efficiency of our method.
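The core idea of reusing the Hessian across iterations ("lazy" Hessian updates) can be illustrated with a minimal Newton-type sketch on a toy saddle-point problem. This is not the paper's algorithm: the problem instance, the damping factor, and the reuse period `m` below are all illustrative assumptions. The sketch recomputes the Jacobian of the saddle-point operator only every `m` steps and reuses the (stale) factorization in between, which is where the computational savings come from.

```python
import numpy as np

# Hypothetical toy saddle-point problem (chosen for illustration only):
#   f(x, y) = log(1 + exp(a^T x)) + x^T B y - 0.5 * ||y||^2
# The operator F(z) = [grad_x f; -grad_y f] is monotone, and a root of F
# is a saddle point of f.

rng = np.random.default_rng(0)
d = 4
a = rng.standard_normal(d)
B = rng.standard_normal((d, d)) / np.sqrt(d)

def F(z):
    x, y = z[:d], z[d:]
    s = 1.0 / (1.0 + np.exp(-a @ x))          # sigmoid(a^T x)
    gx = s * a + B @ y                        # grad_x f
    gy = B.T @ x - y                          # grad_y f
    return np.concatenate([gx, -gy])

def jac_F(z):
    x = z[:d]
    s = 1.0 / (1.0 + np.exp(-a @ x))
    H = s * (1.0 - s) * np.outer(a, a)        # Hessian of the logistic term
    top = np.hstack([H, B])
    bot = np.hstack([-B.T, np.eye(d)])
    return np.vstack([top, bot])

def lazy_newton(z0, m=3, iters=30):
    """Damped Newton on F, recomputing the Jacobian only every m steps."""
    z = z0.copy()
    J = None
    for k in range(iters):
        if k % m == 0:                        # "lazy" Hessian snapshot
            J = jac_F(z)                      # expensive oracle, amortized over m steps
        z = z - 0.5 * np.linalg.solve(J, F(z))
    return z

z = lazy_newton(rng.standard_normal(2 * d))
print(np.linalg.norm(F(z)))                   # small residual at the saddle point
```

With `m = 1` this is a plain (damped) Newton iteration; increasing `m` trades per-iteration cost for a mildly slower rate, mirroring the amortization analyzed in the paper.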