An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization

Abstract
This paper studies stochastic nonconvex-strongly-concave minimax optimization over a multi-agent network. We propose an efficient algorithm, called the Decentralized Recursive gradient descEnt Ascent Method (DREAM), which achieves the best-known theoretical guarantee for finding an ε-stationary point. Concretely, it requires O(min(κ³ε⁻³, κ²√n·ε⁻²)) stochastic first-order oracle (SFO) calls and O(κ²ε⁻²) communication rounds, where κ is the condition number and n is the total number of individual functions. Our numerical experiments also validate the superiority of DREAM over previous methods.
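To make the problem setting concrete, the sketch below runs a plain decentralized stochastic gradient descent-ascent loop on a toy nonconvex-strongly-concave objective split across four agents that gossip over a ring. This is a generic baseline for illustration only, not the DREAM recursion from the paper; the objective, mixing matrix, and step sizes are all illustrative assumptions.

```python
# Toy decentralized stochastic gradient descent-ascent (illustrative only;
# NOT the DREAM algorithm). Each agent i holds a local objective
#   f_i(x, y) = x^2/2 + cos(a_i * x) + b_i * x * y - y^2/2,
# which is nonconvex in x and strongly concave in y. Agents average their
# local copies of (x, y) through a doubly stochastic mixing matrix W
# (one communication round), then take a local stochastic descent/ascent step.
import math
import random

n = 4                      # number of agents
a = [1.0, 2.0, 0.5, 1.5]   # local data held by each agent (assumed values)
b = [0.3, -0.2, 0.1, 0.4]

# Doubly stochastic mixing matrix for a 4-agent ring:
# weight 0.5 on self, 0.25 on each of the two ring neighbors.
W = [[0.5 if i == j else (0.25 if abs(i - j) in (1, 3) else 0.0)
      for j in range(n)] for i in range(n)]

def grad_x(i, x, y):
    # Partial derivative of f_i with respect to x (nonconvex part from cos).
    return x - a[i] * math.sin(a[i] * x) + b[i] * y

def grad_y(i, x, y):
    # Partial derivative of f_i with respect to y (strongly concave: -y term).
    return b[i] * x - y

random.seed(0)
xs = [1.0] * n             # local copies of the min variable x
ys = [0.0] * n             # local copies of the max variable y
eta_x, eta_y = 0.01, 0.1   # illustrative step sizes

for t in range(2000):
    # Stochastic gradients: small Gaussian noise mimics SFO calls.
    gx = [grad_x(i, xs[i], ys[i]) + 0.01 * random.gauss(0, 1) for i in range(n)]
    gy = [grad_y(i, xs[i], ys[i]) + 0.01 * random.gauss(0, 1) for i in range(n)]
    # One communication round (gossip averaging), then local updates:
    # descent in x, ascent in y.
    xs = [sum(W[i][j] * xs[j] for j in range(n)) - eta_x * gx[i] for i in range(n)]
    ys = [sum(W[i][j] * ys[j] for j in range(n)) + eta_y * gy[i] for i in range(n)]

x_bar = sum(xs) / n
y_bar = sum(ys) / n
# Consensus error: how far local copies are from the network average.
consensus = max(abs(v - x_bar) for v in xs)
print("consensus error:", round(consensus, 4))
print("averaged iterate:", round(x_bar, 4), round(y_bar, 4))
```

DREAM improves on this kind of baseline by using recursive (variance-reduced) gradient estimators, which is what drives the improved SFO and communication complexities stated in the abstract.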