
Faster Stochastic Algorithms for Minimax Optimization under Polyak--Łojasiewicz Conditions

Neural Information Processing Systems (NeurIPS), 2023
Abstract

This paper considers stochastic first-order algorithms for minimax optimization under Polyak--{\L}ojasiewicz (PL) conditions. We propose SPIDER-GDA for solving the finite-sum problem of the form $\min_x \max_y f(x,y) \triangleq \frac{1}{n}\sum_{i=1}^n f_i(x,y)$, where the objective function $f(x,y)$ is $\mu_x$-PL in $x$ and $\mu_y$-PL in $y$, and each $f_i(x,y)$ is $L$-smooth. We prove that SPIDER-GDA finds an $\epsilon$-optimal solution within ${\mathcal O}\left((n + \sqrt{n}\,\kappa_x\kappa_y^2)\log(1/\epsilon)\right)$ stochastic first-order oracle (SFO) complexity, which is better than the state-of-the-art method whose SFO upper bound is ${\mathcal O}\big((n + n^{2/3}\kappa_x\kappa_y^2)\log(1/\epsilon)\big)$, where $\kappa_x \triangleq L/\mu_x$ and $\kappa_y \triangleq L/\mu_y$. For the ill-conditioned case, we provide an accelerated algorithm that reduces the computational cost further: it achieves an $\tilde{{\mathcal O}}\big((n + \sqrt{n}\,\kappa_x\kappa_y)\log^2(1/\epsilon)\big)$ SFO upper bound when $\kappa_y \gtrsim \sqrt{n}$. Our ideas can also be applied to the more general setting in which the objective function satisfies the PL condition in only one variable. Numerical experiments validate the superiority of the proposed methods.
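To make the problem class concrete, the sketch below runs plain simultaneous stochastic gradient descent-ascent (GDA) on a hypothetical toy finite-sum quadratic game that is strongly convex in $x$ and strongly concave in $y$ (hence PL in both variables). This is only an illustration of the setting; it is not the paper's SPIDER-GDA, which additionally applies SPIDER-style variance reduction to the stochastic gradients, and all coefficients here are made up for the example.

```python
import numpy as np

# Toy finite-sum minimax objective: f(x, y) = (1/n) * sum_i fi(x, y), with
#   fi(x, y) = 0.5*a_i*x^2 + b_i*x*y - 0.5*c_i*y^2
# (strongly convex in x, strongly concave in y, hence PL in both variables).
# Illustrative only -- NOT the paper's SPIDER-GDA.
rng = np.random.default_rng(0)
n = 100
a = rng.uniform(1.0, 2.0, n)   # curvature in x for each component
b = rng.uniform(-0.5, 0.5, n)  # coupling coefficients
c = rng.uniform(1.0, 2.0, n)   # curvature in y for each component

def grad_i(i, x, y):
    """Stochastic first-order oracle (SFO): gradient of component fi at (x, y)."""
    gx = a[i] * x + b[i] * y   # d fi / dx
    gy = b[i] * x - c[i] * y   # d fi / dy
    return gx, gy

x, y = 5.0, -3.0
eta = 0.05                     # step size, well below 1/L for this toy problem
for _ in range(5000):
    i = rng.integers(n)        # sample one component (one SFO call)
    gx, gy = grad_i(i, x, y)
    x -= eta * gx              # descent step on x
    y += eta * gy              # ascent step on y

# The unique saddle point of this quadratic game is (0, 0).
print(abs(x) < 1e-3, abs(y) < 1e-3)
```

Because the gradient noise in this toy problem is multiplicative (it vanishes at the saddle point), plain GDA converges to $(0,0)$ here even with a constant step size; in general, variance-reduction schemes such as SPIDER are what yield the $\sqrt{n}$-type complexities quoted in the abstract.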
