An Optimal Stochastic Algorithm for Decentralized Nonconvex Finite-sum
Optimization
This paper studies the synchronized decentralized nonconvex optimization problem of the form $\min_{x \in \mathbb{R}^d} f(x) \triangleq \frac{1}{m}\sum_{i=1}^{m} f_i(x)$, where $f_i(x) \triangleq \frac{1}{n}\sum_{j=1}^{n} f_{i,j}(x)$ is the local function on the $i$-th agent of the connected network. We propose a novel stochastic algorithm called DEcentralized probAbilistic Recursive gradiEnt deScenT (DEAREST), which integrates the techniques of variance reduction, gradient tracking and multi-consensus. For the convergence analysis, we construct a Lyapunov function that simultaneously characterizes the function value, the gradient estimation error and the consensus error. Based on this measure, we provide a concise proof showing that DEAREST requires at most $\mathcal{O}(mn + \sqrt{mn}\,L\varepsilon^{-2})$ incremental first-order oracle (IFO) calls and $\mathcal{O}(L\varepsilon^{-2}/\sqrt{1-\lambda_2(W)})$ communication rounds to find an $\varepsilon$-stationary point in expectation, where $L$ is the smoothness parameter and $\lambda_2(W)$ is the second-largest eigenvalue of the gossip matrix $W$. Both the IFO complexity and the communication complexity match the corresponding lower bounds. To the best of our knowledge, DEAREST is the first optimal algorithm for decentralized nonconvex finite-sum optimization.
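To make one of the named ingredients concrete, the sketch below implements generic decentralized gradient tracking with a gossip matrix on a toy quadratic problem. This is not DEAREST itself (which additionally uses probabilistic recursive variance reduction and multi-consensus); all problem data, the ring topology, and the step size are illustrative assumptions.

```python
import numpy as np

# Generic gradient-tracking sketch (one ingredient of decentralized methods
# like the one in the abstract), NOT the full DEAREST algorithm.
rng = np.random.default_rng(0)
m, d = 8, 5                      # m agents, dimension d

# Synthetic local objectives f_i(x) = 0.5 * ||x - b_i||^2, so grad f_i(x) = x - b_i.
B = rng.normal(size=(m, d))

def grad(X):
    # Row i of the output is grad f_i evaluated at agent i's iterate x_i.
    return X - B

# Symmetric doubly stochastic gossip matrix W for a ring topology.
W = np.zeros((m, m))
for i in range(m):
    W[i, i] = 0.5
    W[i, (i - 1) % m] = 0.25
    W[i, (i + 1) % m] = 0.25

eta = 0.2                        # step size (assumed; needs eta < 1/L, here L = 1)
X = np.zeros((m, d))             # row i is agent i's iterate x_i
G = grad(X)                      # gradient trackers, initialized to local gradients

for _ in range(200):
    X_new = W @ X - eta * G                # gossip averaging + descent along tracker
    G = W @ G + grad(X_new) - grad(X)      # gradient-tracking update
    X = X_new

# All agents should approach the minimizer of the average objective, mean(b_i).
x_star = B.mean(axis=0)
print(np.abs(X - x_star).max())
```

The tracker update keeps the network-wide average of the rows of `G` equal to the average of the local gradients at the current iterates, which is what lets every agent descend along an estimate of the global gradient despite only communicating with its ring neighbors.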