A Simple and Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization

International Conference on Artificial Intelligence and Statistics (AISTATS), 2022
Abstract

This paper studies stochastic optimization for the decentralized nonconvex-strongly-concave minimax problem. We propose a simple and efficient algorithm, called the Decentralized Recursive gradient descEnt Ascent Method (DREAM), which requires $\mathcal{O}(\kappa^3\epsilon^{-3})$ stochastic first-order oracle (SFO) calls and $\mathcal{O}\big(\kappa^2\epsilon^{-2}/\sqrt{1-\lambda_2(W)}\,\big)$ communication rounds to find an $\epsilon$-stationary point, where $\kappa$ is the condition number and $\lambda_2(W)$ is the second-largest eigenvalue of the gossip matrix $W$. To the best of our knowledge, DREAM is the first algorithm whose SFO and communication complexities simultaneously achieve the optimal dependency on $\epsilon$ and $\lambda_2(W)$ for this problem.
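To make the decentralized setting concrete, below is a minimal, illustrative sketch (not the authors' DREAM algorithm) of plain decentralized gradient descent ascent with gossip averaging on a toy bilinear-quadratic objective. The local objective, step sizes, and ring gossip matrix $W$ used here are assumptions introduced only for this example; the sketch omits the recursive gradient estimation indicated by DREAM's name and is meant only to illustrate the local primal/dual variables and the role of $W$.

```python
import numpy as np

# Illustrative sketch (NOT the authors' DREAM algorithm): plain decentralized
# gradient descent ascent with one gossip round per iteration, on an assumed
# toy local objective f_i(x, y) = x^T A_i y - 0.5 * ||y||^2 (strongly concave in y).

rng = np.random.default_rng(0)
n_nodes, d = 8, 5                        # number of agents, problem dimension
A = rng.normal(size=(n_nodes, d, d))     # per-node data (toy assumption)

def grad_x(x, y, i):
    # Gradient of f_i with respect to x.
    return A[i] @ y

def grad_y(x, y, i):
    # Gradient of f_i with respect to y (strongly concave part).
    return A[i].T @ x - y

# Symmetric, doubly stochastic gossip matrix for a ring graph (assumption).
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i + 1) % n_nodes] = 0.25
    W[i, (i - 1) % n_nodes] = 0.25

x = rng.normal(size=(n_nodes, d))        # local primal variables x_i
y = rng.normal(size=(n_nodes, d))        # local dual variables y_i
eta_x, eta_y = 1e-2, 5e-2                # illustrative step sizes

for t in range(500):
    gx = np.stack([grad_x(x[i], y[i], i) for i in range(n_nodes)])
    gy = np.stack([grad_y(x[i], y[i], i) for i in range(n_nodes)])
    # One gossip round mixes neighbors' iterates; each node then takes a
    # descent step in x and an ascent step in y.
    x = W @ x - eta_x * gx
    y = W @ y + eta_y * gy

print("consensus error in x:", np.linalg.norm(x - x.mean(axis=0)))
```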
