Distributed stochastic optimization with gradient tracking over strongly-connected networks

Abstract

In this paper, we study distributed stochastic optimization to minimize a sum of smooth and strongly-convex local cost functions over a network of agents, communicating over a strongly-connected graph. Assuming that each agent has access to a stochastic first-order oracle ($\mathcal{SFO}$), we propose a novel distributed method, called $\mathcal{S}$-$\mathcal{AB}$, where each agent uses an auxiliary variable to asymptotically track the gradient of the global cost in expectation. The $\mathcal{S}$-$\mathcal{AB}$ algorithm employs row- and column-stochastic weights simultaneously to ensure both consensus and optimality. Since doubly-stochastic weights are not used, $\mathcal{S}$-$\mathcal{AB}$ is applicable to arbitrary strongly-connected graphs. We show that under a sufficiently small constant step-size, $\mathcal{S}$-$\mathcal{AB}$ converges linearly (in the expected mean-square sense) to a neighborhood of the global minimizer. We present numerical simulations based on real-world data sets to illustrate the theoretical results.
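To make the update structure concrete, below is a minimal sketch of stochastic gradient tracking with separate row-stochastic and column-stochastic mixing matrices, in the spirit of the $\mathcal{S}$-$\mathcal{AB}$ description above. The exact update form, the directed-ring graph, the quadratic local costs, and the noise model are illustrative assumptions, not the paper's reference implementation.

```python
# Sketch of stochastic gradient tracking with a row-stochastic matrix R
# (consensus on the estimates) and a column-stochastic matrix C (tracking
# of the average gradient). All problem data below are assumed for
# illustration only.
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 3            # number of agents, problem dimension
alpha = 0.01           # constant step-size

# Strongly-convex local costs f_i(x) = 0.5*x^T Q_i x - b_i^T x (assumed).
Q = [np.diag(rng.uniform(1.0, 2.0, d)) for _ in range(n)]
b = [rng.standard_normal(d) for _ in range(n)]

def stochastic_grad(i, x):
    """SFO model (assumed): exact local gradient plus zero-mean noise."""
    return Q[i] @ x - b[i] + 0.05 * rng.standard_normal(d)

# Directed ring with self-loops (assumed strongly-connected graph).
# R is row-stochastic, C is column-stochastic; neither is doubly stochastic
# in general, which is the point of using both.
W = np.eye(n) + np.eye(n, k=-1)
W[0, -1] = 1.0
R = W / W.sum(axis=1, keepdims=True)   # rows sum to 1
C = W / W.sum(axis=0, keepdims=True)   # columns sum to 1

x = np.zeros((n, d))                              # local estimates
g = np.array([stochastic_grad(i, x[i]) for i in range(n)])
y = g.copy()                                      # gradient trackers

for k in range(2000):
    x = R @ x - alpha * y                         # consensus + descent
    g_new = np.array([stochastic_grad(i, x[i]) for i in range(n)])
    y = C @ y + g_new - g                         # track global gradient
    g = g_new

# With a small constant step-size, the rows of x should settle in a
# neighborhood of the global minimizer of sum_i f_i.
x_star = np.linalg.solve(sum(Q), sum(b))
print(np.linalg.norm(x.mean(axis=0) - x_star))
```

Because the noise is persistent and the step-size is constant, the iterates do not converge exactly but fluctuate around the minimizer, matching the "neighborhood" guarantee stated in the abstract.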
