An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization

Abstract

This paper studies stochastic nonconvex-strongly-concave minimax optimization over a multi-agent network. We propose an efficient algorithm, called the Decentralized Recursive gradient descEnt Ascent Method (DREAM), which achieves the best-known theoretical guarantee for finding $\epsilon$-stationary points. Concretely, it requires $\mathcal{O}(\min(\kappa^3\epsilon^{-3},\, \kappa^2\sqrt{N}\epsilon^{-2}))$ stochastic first-order oracle (SFO) calls and $\tilde{\mathcal{O}}(\kappa^2\epsilon^{-2})$ communication rounds, where $\kappa$ is the condition number and $N$ is the total number of individual functions. Our numerical experiments also validate the superiority of DREAM over previous methods.
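For context only, the sketch below illustrates the problem setting the abstract describes: each agent holds a local function $f_i(x, y)$, local iterates mix over a network via a gossip matrix, and gradients are accessed through a noisy stochastic first-order oracle. This is plain decentralized stochastic gradient descent ascent, not the authors' DREAM method (which, as its name indicates, additionally uses recursive gradient estimation); the toy functions $f_i$, the ring mixing matrix `W`, the step sizes, and the noise model are all illustrative assumptions.

```python
import numpy as np

# Toy nonconvex-strongly-concave problem, one local function per agent:
#   f_i(x, y) = sum(cos(x)) + x^T B_i y - (mu / 2) * ||y||^2,
# nonconvex in x (cosine term) and mu-strongly concave in y.
rng = np.random.default_rng(0)
n, d, mu = 4, 5, 1.0
B = [rng.standard_normal((d, d)) / d for _ in range(n)]

# Ring-topology doubly stochastic mixing matrix W (an assumption;
# any connected gossip matrix would serve the same role).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

x = np.zeros((n, d))                  # each agent keeps local copies of x and y
y = np.zeros((n, d))
eta_x, eta_y, sigma = 0.05, 0.1, 0.1  # step sizes and SFO noise level

for t in range(2000):
    # Stochastic first-order oracle: local gradient plus Gaussian noise.
    gx = np.stack([-np.sin(x[i]) + B[i] @ y[i] for i in range(n)])
    gy = np.stack([B[i].T @ x[i] - mu * y[i] for i in range(n)])
    gx += sigma * rng.standard_normal(gx.shape)
    gy += sigma * rng.standard_normal(gy.shape)
    # One gossip round averages neighbors' iterates, followed by a
    # local descent step in x and an ascent step in y.
    x = W @ x - eta_x * gx
    y = W @ y + eta_y * gy

x_bar = x.mean(axis=0)
print("consensus error:", np.linalg.norm(x - x_bar))
```

In this template, each outer iteration costs one SFO call and one communication round per agent; DREAM's contribution is to drive both costs down to the complexities stated above.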
