Stochastic Alternating Direction Method of Multipliers with Variance
Reduction for Nonconvex Optimization
In this work, we study the stochastic alternating direction method of multipliers (ADMM) for nonconvex optimization, and propose three classes of nonconvex stochastic ADMM with variance reduction, based on different variance-reduced stochastic gradients. Specifically, the first class, the nonconvex stochastic variance reduced gradient ADMM (SVRG-ADMM), uses a multi-stage scheme to progressively reduce the variance of the stochastic gradients. The second is the nonconvex stochastic average gradient ADMM (SAG-ADMM), which additionally uses the old gradients estimated in previous iterations. The third, SAGA-ADMM, is an extension of the SAG-ADMM method. Moreover, under some mild conditions, we establish the iteration complexity bound of the proposed methods to reach an ε-stationary solution of the nonconvex problems. In particular, we provide a general framework to analyze the iteration complexity of these nonconvex stochastic ADMMs with variance reduction. Finally, numerical experiments demonstrate the effectiveness of our methods. To the best of our knowledge, this is the first analysis of the iteration complexity of stochastic ADMM for nonconvex optimization.
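To make the multi-stage variance-reduction idea concrete, the following is a minimal sketch of an SVRG-style stochastic ADMM, not the paper's exact algorithm: it solves min_{x,y} f(x) + λ‖y‖₁ subject to x − y = 0 with f(x) = (1/n) Σᵢ fᵢ(x), using a linearized x-update. All names and parameters (grad_fi, step size eta, penalty rho, stage and epoch lengths) are illustrative assumptions.

```python
import numpy as np

def svrg_admm(grad_fi, n, dim, lam=0.1, rho=1.0, eta=0.1,
              stages=10, inner=50, rng=None):
    """Sketch of SVRG-ADMM for min f(x) + lam*||y||_1 s.t. x - y = 0."""
    rng = rng or np.random.default_rng(0)
    x = np.zeros(dim); y = np.zeros(dim); u = np.zeros(dim)  # u: scaled dual
    for s in range(stages):
        x_snap = x.copy()
        # Full gradient at the snapshot, computed once per stage.
        mu = np.mean([grad_fi(x_snap, i) for i in range(n)], axis=0)
        for t in range(inner):
            i = rng.integers(n)
            # SVRG estimator: unbiased, with variance that shrinks
            # as the iterate approaches the snapshot.
            v = grad_fi(x, i) - grad_fi(x_snap, i) + mu
            # Linearized x-update: gradient step on the augmented Lagrangian.
            x = x - eta * (v + rho * (x - y + u))
            # Exact y-update: soft-thresholding (prox of (lam/rho)*||.||_1).
            z = x + u
            y = np.sign(z) * np.maximum(np.abs(z) - lam / rho, 0.0)
            # Dual ascent on the scaled multiplier.
            u = u + (x - y)
    return x, y
```

The SAG/SAGA variants described in the abstract would replace the snapshot-based estimator `v` with a table of stored per-component gradients updated incrementally, trading the periodic full-gradient pass for O(n) memory.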