This paper studies second-order methods for convex-concave minimax optimization. Monteiro and Svaiter (2012) proposed a method that solves the problem with the optimal iteration complexity of O(ε^{-2/3}) for finding an ε-saddle point. However, it was unclear whether the resulting computational complexity, O((N + d^2) d ε^{-2/3}), can be improved. Here, following Doikov et al. (2023), we assume the cost of one first-order oracle call is N and the cost of one second-order oracle call is dN. In this paper, we show that the computation cost can be reduced by reusing the Hessian across iterations. Our methods achieve an overall computational complexity of Õ((N + d^2) d^{2/3} ε^{-2/3}), which improves upon previous methods by a factor of d^{1/3}. Furthermore, we generalize our method to strongly-convex-strongly-concave minimax problems and establish a complexity of Õ((N + d^2) d^{2/3} κ^{2/3}) when the condition number of the problem is κ, enjoying a similar speedup over the state-of-the-art method. Numerical experiments on both real and synthetic datasets also verify the efficiency of our method.
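The Hessian-reuse idea described above can be illustrated with a minimal sketch: a Newton-type loop that recomputes and factorizes the Hessian only once every m iterations and reuses the cached factorization in between, so the expensive O(d^3) linear-algebra cost is amortized over m cheap O(d^2) steps. This is not the paper's minimax algorithm, only a toy on a convex minimization problem; the function `lazy_newton`, the toy objective, and the choice m = 5 are all illustrative assumptions.

```python
import numpy as np

def lazy_newton(grad, hess, x0, m=5, iters=50, reg=1e-8):
    """Newton-type iteration with lazy Hessian updates.

    The Hessian is refreshed (and inverted) only every m steps; the
    stale cached inverse is reused in between.  Illustrative sketch
    only -- the paper applies this idea inside a second-order method
    for minimax problems.
    """
    x = np.asarray(x0, dtype=float)
    d = x.size
    H_inv = None
    for t in range(iters):
        if t % m == 0:
            # Snapshot point: pay the O(d^3) cost only here.
            # (An explicit inverse is used for brevity; a cached
            # Cholesky factorization would be the practical choice.)
            H_inv = np.linalg.inv(hess(x) + reg * np.eye(d))
        # Cheap O(d^2) step using the cached (possibly stale) inverse.
        x = x - H_inv @ grad(x)
    return x

# Toy strongly convex objective: f(x) = 0.25 * sum(x^4) + 0.5 * ||x||^2,
# whose unique minimizer is x = 0.
grad = lambda x: x**3 + x
hess = lambda x: np.diag(3.0 * x**2) + np.eye(x.size)

x_star = lazy_newton(grad, hess, np.ones(3))
```

Even with the stale Hessian, the iterates contract toward the minimizer; the trade-off between the staleness penalty and the amortized factorization cost is exactly what determines the optimal refresh period in the lazy-Hessian analysis.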
@article{chen2025_2410.09568,
  title={Second-Order Min-Max Optimization with Lazy Hessians},
  author={Lesi Chen and Chengchang Liu and Jingzhao Zhang},
  journal={arXiv preprint arXiv:2410.09568},
  year={2025}
}