Efficient Algorithms for Smooth Minimax Optimization

This paper studies first order methods for solving smooth minimax optimization problems $\min_x \max_y g(x,y)$ where $g(\cdot,\cdot)$ is smooth and $g(x,\cdot)$ is concave for each $x$. In terms of $g(\cdot,y)$, we consider two settings -- strongly convex and nonconvex -- and improve upon the best known rates in both. For strongly-convex $g(\cdot,y)$, $\forall y$, we propose a new algorithm combining Mirror-Prox and Nesterov's AGD, and show that it can find the global optimum at a rate of $\tilde{O}(1/k^2)$, improving over the current state-of-the-art rate of $O(1/k)$. We use this result along with an inexact proximal point method to obtain an $\tilde{O}(1/k^{1/3})$ rate for finding stationary points in the nonconvex setting where $g(\cdot,y)$ can be nonconvex. This improves over the current best-known rate of $O(1/k^{1/5})$. Finally, we instantiate our result for finite nonconvex minimax problems, i.e., $\min_x \max_{1 \le i \le m} f_i(x)$ with nonconvex $f_i(\cdot)$, to obtain a convergence rate of $O(m (\log m)^{3/2} / k^{1/3})$ in total gradient evaluations for finding a stationary point.
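To make the $\min_x \max_y g(x,y)$ setting concrete, the sketch below runs plain Mirror-Prox (extragradient) updates, one of the two building blocks named above, on a toy strongly-convex--concave quadratic. It is not the paper's combined algorithm; the objective $g(x,y) = \tfrac{1}{2}ax^2 + bxy - \tfrac{1}{2}cy^2$, the constants `a`, `b`, `c`, and the step size `eta` are illustrative assumptions chosen so the iterates converge to the saddle point.

```python
# Minimal Mirror-Prox / extragradient sketch on a toy saddle-point problem.
# Not the paper's DIAG-style method; purely for intuition about min_x max_y g(x, y).

a, b, c = 1.0, 1.0, 1.0                 # g strongly convex in x, strongly concave in y (assumed)
grad_x = lambda x, y: a * x + b * y     # dg/dx
grad_y = lambda x, y: b * x - c * y     # dg/dy

x, y = 5.0, -3.0                        # arbitrary starting point
eta = 0.2                               # step size, assumed small relative to smoothness

for _ in range(200):
    # extrapolation step: descent in x, ascent in y at the current point
    x_half = x - eta * grad_x(x, y)
    y_half = y + eta * grad_y(x, y)
    # update step: reuse gradients evaluated at the extrapolated point
    x = x - eta * grad_x(x_half, y_half)
    y = y + eta * grad_y(x_half, y_half)

print(x, y)  # approaches the saddle point (0, 0)
```

The paper's strongly-convex result wraps accelerated updates (Nesterov's AGD) around such proximal/extragradient steps; the sketch only shows the unaccelerated extragradient component.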