A Simple and Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization

International Conference on Artificial Intelligence and Statistics (AISTATS), 2022
Abstract

This paper studies stochastic optimization for the decentralized nonconvex-strongly-concave minimax problem. We propose a simple and efficient algorithm, called the Decentralized Recursive-gradient descEnt Ascent Method (\texttt{DREAM}), which achieves the best-known theoretical guarantees for finding an $\epsilon$-stationary point of the primal function. In the online setting, the proposed method requires $\mathcal{O}(\kappa^3\epsilon^{-3})$ stochastic first-order oracle (SFO) calls and $\mathcal{O}\big(\kappa^2\epsilon^{-2}/\sqrt{1-\lambda_2(W)}\,\big)$ communication rounds to find an $\epsilon$-stationary point, where $\kappa$ is the condition number and $\lambda_2(W)$ is the second-largest eigenvalue of the gossip matrix $W$. In the offline setting with $N$ component functions in total, the proposed method requires $\mathcal{O}\big(\kappa^2 \sqrt{N} \epsilon^{-2}\big)$ SFO calls and the same communication complexity as in the online setting.
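
The abstract names two ingredients: a recursive (variance-reduced) stochastic gradient estimator and gossip-based communication through the mixing matrix $W$. The sketch below is a minimal illustration of how those two pieces can be combined in a decentralized descent-ascent loop; it is not the authors' implementation, and all names (`grad_x`, `grad_y`, `eta_x`, `eta_y`, `beta`, the sample-index range) are hypothetical placeholders. The recursive update follows the standard STORM/SPIDER-style form $v_{t+1} = g(\text{new}) + (1-\beta)\,(v_t - g(\text{old}))$ evaluated with a shared fresh sample.

```python
import numpy as np

def gossip(W, Z):
    """One communication round: mix each agent's row of Z with its
    neighbors via the doubly stochastic gossip matrix W (n x n)."""
    return W @ Z

def dream_sketch(W, grad_x, grad_y, X, Y, eta_x=1e-2, eta_y=1e-2,
                 beta=0.1, T=100, rng=None):
    """Hypothetical decentralized recursive-gradient descent-ascent loop.
    X, Y are (n, d) arrays of per-agent primal/dual iterates; grad_x and
    grad_y are stochastic gradient oracles taking (x_i, y_i, sample)."""
    rng = rng or np.random.default_rng(0)
    n = W.shape[0]
    # Initialize gradient trackers with one stochastic sample per agent.
    xi = rng.integers(0, 10, size=n)  # hypothetical sample indices
    Gx = np.stack([grad_x(X[i], Y[i], xi[i]) for i in range(n)])
    Gy = np.stack([grad_y(X[i], Y[i], xi[i]) for i in range(n)])
    for t in range(T):
        X_old, Y_old = X.copy(), Y.copy()
        # Descent on x, ascent on y, each followed by a gossip round.
        X = gossip(W, X - eta_x * Gx)
        Y = gossip(W, Y + eta_y * Gy)
        # Recursive estimator: reuse one fresh sample at both the new
        # and the old point, then gossip-average the trackers.
        xi = rng.integers(0, 10, size=n)
        Gx_new = np.stack([grad_x(X[i], Y[i], xi[i]) for i in range(n)])
        Gx_old = np.stack([grad_x(X_old[i], Y_old[i], xi[i]) for i in range(n)])
        Gx = gossip(W, Gx_new + (1.0 - beta) * (Gx - Gx_old))
        Gy_new = np.stack([grad_y(X[i], Y[i], xi[i]) for i in range(n)])
        Gy_old = np.stack([grad_y(X_old[i], Y_old[i], xi[i]) for i in range(n)])
        Gy = gossip(W, Gy_new + (1.0 - beta) * (Gy - Gy_old))
    return X, Y
```

A toy run only needs `grad_x`/`grad_y` callables and a doubly stochastic `W`; the actual \texttt{DREAM} update, including its step-size and momentum schedules that yield the stated complexities, is specified in the paper.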
