An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization

International Conference on Artificial Intelligence and Statistics (AISTATS), 2022
Abstract

This paper studies stochastic optimization for decentralized nonconvex-strongly-concave (NC-SC) minimax problems over a multi-agent network. We propose an efficient algorithm, called the Decentralized Recursive-gradient descEnt Ascent Method (DREAM), which achieves the best-known theoretical guarantee for finding an $\epsilon$-stationary point of the primal function. The proposed method requires $\mathcal{O}(\min(\kappa^3\epsilon^{-3}, \sqrt{N}\kappa^2\epsilon^{-2}))$ stochastic first-order oracle (SFO) calls and $\tilde{\mathcal{O}}(\kappa^2\epsilon^{-2})$ communication rounds to find an $\epsilon$-stationary point, where $\kappa$ is the condition number. DREAM achieves the best-known complexity in both the online and offline setups.
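For context, the decentralized NC-SC minimax problem in this line of work is typically formulated as follows. This is a standard sketch of the setting, not quoted from the paper; the notation ($n$ agents, local objectives $f_i$, smoothness constant $L$, strong-concavity modulus $\mu$) is assumed:

```latex
% n agents jointly solve a minimax problem with local objectives f_i
% (standard decentralized NC-SC formulation; notation assumed, not from the paper):
\min_{x \in \mathbb{R}^{d_x}} \; \max_{y \in \mathbb{R}^{d_y}} \;
  f(x, y) = \frac{1}{n} \sum_{i=1}^{n} f_i(x, y),
% where each f_i(x, \cdot) is \mu-strongly concave in y and f_i is L-smooth.
% The primal function and condition number are
\Phi(x) = \max_{y} f(x, y), \qquad \kappa = L / \mu,
% and an \epsilon-stationary point of the primal function satisfies
\| \nabla \Phi(x) \| \le \epsilon.
```

Under strong concavity in $y$, $\Phi$ is differentiable and smooth, which is what makes $\|\nabla\Phi(x)\| \le \epsilon$ a meaningful stationarity measure for the nonconvex outer problem.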
