An Efficient Stochastic Algorithm for Decentralized
Nonconvex-Strongly-Concave Minimax Optimization
International Conference on Artificial Intelligence and Statistics (AISTATS), 2022
Abstract
This paper studies stochastic optimization for decentralized nonconvex-strongly-concave (NC-SC) minimax problems over a multi-agent network. We propose an efficient algorithm, called the Decentralized Recursive-gradient descEnt Ascent Method (DREAM), which achieves the best-known theoretical guarantees for finding an ε-stationary point of the primal function. The number of stochastic first-order oracle (SFO) calls and communication rounds the method requires to reach an ε-stationary point is bounded in terms of the target accuracy ε and the condition number κ. DREAM achieves the best-known complexity in both the online and offline settings.
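For context, the following is a minimal sketch of the problem setup using conventions that are standard in the NC-SC minimax literature; the paper's exact formulation and constants are not reproduced here, so the symbols and definitions below are assumptions rather than the authors' statement.

```latex
% Standard decentralized NC-SC minimax conventions (assumed, not taken from the paper):
% m agents jointly minimize over x and maximize over y an average of local objectives.
\begin{align*}
  \min_{x \in \mathbb{R}^{d_x}} \; \max_{y \in \mathbb{R}^{d_y}} \;
    f(x, y) &:= \frac{1}{m} \sum_{i=1}^{m} f_i(x, y),
    && \text{each } f_i \text{ nonconvex in } x,\ \mu\text{-strongly concave in } y, \\
  \Phi(x) &:= \max_{y \in \mathbb{R}^{d_y}} f(x, y)
    && \text{(primal function)}, \\
  \|\nabla \Phi(x)\| &\le \epsilon
    && \text{($\epsilon$-stationary point of } \Phi\text{)}, \\
  \kappa &:= L / \mu
    && \text{(condition number, with } L\text{-Lipschitz gradients)}.
\end{align*}
```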
