A Simple and Efficient Stochastic Algorithm for Decentralized
Nonconvex-Strongly-Concave Minimax Optimization
This paper studies stochastic optimization for the decentralized nonconvex-strongly-concave minimax problem. We propose a simple and efficient algorithm, called the Decentralized Recursive-gradient descEnt Ascent Method (\texttt{DREAM}), which achieves the best-known theoretical guarantees for finding an $\epsilon$-stationary point of the primal function. In the online setting, we bound the number of stochastic first-order oracle (SFO) calls and communication rounds required to find an $\epsilon$-stationary point in terms of the accuracy $\epsilon$, the condition number $\kappa$, and the second-largest eigenvalue $\lambda_2$ of the gossip matrix $W$. In the offline setting with $n$ component functions in total, we establish a corresponding SFO complexity bound and the same communication complexity as in the online setting.
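As a rough illustration of the decentralized descent-ascent template behind such methods (not the paper's \texttt{DREAM}, which additionally uses recursive variance-reduced gradient estimators), the sketch below runs plain gradient descent ascent on a toy problem that is strongly concave in $y$, with one gossip round per iteration. The ring gossip matrix $W$, the step sizes, and the local functions $f_i$ are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

m = 5                            # number of agents
a = np.linspace(-1.0, 1.0, m)    # local data: f_i(x, y) = 0.5*(x - a_i)^2 + x*y - 0.5*y^2

# Doubly stochastic gossip matrix for a ring: each agent averages
# with its two neighbors; its second-largest eigenvalue lambda_2
# governs how fast disagreement between agents shrinks.
W = np.zeros((m, m))
for i in range(m):
    W[i, i] = 0.5
    W[i, (i - 1) % m] = 0.25
    W[i, (i + 1) % m] = 0.25

x = np.zeros(m)                  # each agent's copy of the primal variable
y = np.zeros(m)                  # each agent's copy of the dual variable
eta_x, eta_y = 0.05, 0.2         # descent / ascent step sizes (illustrative)

for _ in range(2000):
    gx = (x - a) + y             # local gradient of f_i in x
    gy = x - y                   # local gradient of f_i in y
    # Local descent on x and ascent on y, followed by one gossip round.
    x = W @ (x - eta_x * gx)
    y = W @ (y + eta_y * gy)

# The averaged iterates converge to the saddle point (0, 0) of the
# average objective, while consensus error stays small but nonzero
# because the agents' local data a_i differ.
```

With a variance-reduced (recursive) gradient estimator in place of `gx`/`gy` and stochastic samples in place of exact gradients, this template becomes the kind of method the abstract analyzes; the gossip step is what the $1-\lambda_2$ factor in the communication complexity refers to.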