A Simple and Efficient Stochastic Algorithm for Decentralized
Nonconvex-Strongly-Concave Minimax Optimization
International Conference on Artificial Intelligence and Statistics (AISTATS), 2022
Abstract
This paper studies stochastic optimization for the decentralized nonconvex-strongly-concave minimax problem. We propose a simple and efficient algorithm, called the Decentralized Recursive gradient descEnt Ascent Method (DREAM), which finds an ε-stationary point with stochastic first-order oracle (SFO) and communication complexities depending on the condition number κ and on λ₂(W), the second-largest eigenvalue of the gossip matrix W. To the best of our knowledge, DREAM is the first algorithm whose SFO and communication complexities simultaneously achieve the optimal dependency on ε and λ₂(W) for this problem.
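To make the setting concrete, the sketch below runs a plain decentralized stochastic gradient descent ascent loop with gossip averaging on a toy nonconvex-strongly-concave problem. This is only an illustrative simplification, not the paper's DREAM algorithm (which additionally uses recursive variance reduction); the local objectives, ring gossip matrix, and step sizes are made-up assumptions. It also shows where λ₂(W), the second-largest eigenvalue of the gossip matrix, comes from.

```python
import numpy as np

# Illustrative sketch: decentralized stochastic gradient descent ascent
# with gossip averaging.  NOT the paper's DREAM algorithm -- all problem
# data (local objectives, gossip matrix, step sizes) are assumptions.

n = 5  # number of agents

# Ring-topology gossip matrix W: symmetric and doubly stochastic.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

# lambda_2(W): second-largest eigenvalue magnitude; it controls how
# fast gossip mixing drives the agents toward consensus.
eigs = np.sort(np.abs(np.linalg.eigvalsh(W)))[::-1]
lam2 = eigs[1]
assert lam2 < 1.0  # connected graph => mixing happens

# Local objectives f_i(x, y) = -cos(x) + b_i*x + x*y - 0.5*y**2:
# nonconvex in x, 1-strongly concave in y.  The b_i sum to zero.
b = np.array([-0.2, -0.1, 0.0, 0.1, 0.2])

def grads(x, y, i, rng, sigma=0.01):
    """Stochastic gradients of f_i at (x, y) with Gaussian noise."""
    gx = np.sin(x) + b[i] + y + sigma * rng.normal()
    gy = x - y + sigma * rng.normal()
    return gx, gy

rng = np.random.default_rng(0)
x = rng.normal(size=n)  # each agent keeps its own copy of (x, y)
y = rng.normal(size=n)
eta = 0.05

for _ in range(4000):
    gx = np.empty(n)
    gy = np.empty(n)
    for i in range(n):
        gx[i], gy[i] = grads(x[i], y[i], i, rng)
    # Gossip with neighbors, then a local descent (x) / ascent (y) step.
    x = W @ x - eta * gx
    y = W @ y + eta * gy

# Agents end up in approximate consensus near the stationary point
# (0, 0) of the average objective.
print(lam2, np.abs(x).max(), np.abs(y).max())
```

DREAM replaces the plain stochastic gradients above with recursive (variance-reduced) gradient estimators, which is what improves the SFO complexity; the gossip step plays the same consensus role as here.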
