Mini-Batch Stochastic ADMMs for Nonconvex Nonsmooth Optimization

In this paper, we study mini-batch stochastic ADMMs (alternating direction method of multipliers) for nonconvex nonsmooth optimization. We prove that, given an appropriate mini-batch size, the mini-batch stochastic ADMM without the variance reduction (VR) technique is convergent and reaches a convergence rate of $O(1/T)$ for obtaining a stationary point of the nonconvex optimization, where $T$ denotes the number of iterations. Moreover, we extend the mini-batch stochastic gradient method to both the nonconvex SVRG-ADMM and SAGA-ADMM proposed in our initial paper \citep{huang2016stochastic}, and also prove that these mini-batch stochastic ADMMs reach the convergence rate of $O(1/T)$ without any condition on the mini-batch size. In particular, we give a specific parameter selection for the step size of the stochastic gradients and the penalty parameter of the augmented Lagrangian function. Finally, some experimental results demonstrate the effectiveness of our algorithms.
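To make the setting concrete, below is a minimal sketch (not the authors' exact algorithm or parameter choices) of a mini-batch stochastic ADMM iteration on a lasso-style problem $\min_x \frac{1}{n}\sum_i \ell_i(x) + \lambda\|y\|_1$ s.t. $x - y = 0$: the smooth loss is linearized with a mini-batch stochastic gradient, the nonsmooth term is handled by its proximal operator, and a dual step enforces the constraint. The step size `eta`, penalty `rho`, and mini-batch size `M` are assumed hyperparameters for illustration.

```python
# Illustrative sketch of mini-batch stochastic ADMM (not the paper's exact method).
# Problem: min_x (1/n) * sum_i 0.5*(X[i] @ x - b[i])**2 + lam * ||y||_1  s.t.  x - y = 0.
import numpy as np


def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1; gives the closed-form y-update."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)


def minibatch_stochastic_admm(X, b, lam=0.1, eta=0.05, rho=1.0, M=32, T=1000, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    x = np.zeros(d)   # primal variable tied to the smooth (possibly nonconvex) loss
    y = np.zeros(d)   # primal variable tied to the nonsmooth regularizer
    u = np.zeros(d)   # scaled dual variable for the constraint x - y = 0
    for _ in range(T):
        idx = rng.choice(n, size=M, replace=False)        # draw a mini-batch
        g = X[idx].T @ (X[idx] @ x - b[idx]) / M          # mini-batch stochastic gradient
        # x-update: linearize the loss at x, add a (1/(2*eta)) proximal term,
        # and solve the resulting quadratic subproblem in closed form
        x = (x / eta - g + rho * (y - u)) / (1.0 / eta + rho)
        # y-update: proximal step on the l1 regularizer
        y = soft_threshold(x + u, lam / rho)
        # dual update on the constraint residual
        u = u + (x - y)
    return y


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((500, 20))
    true_x = np.zeros(20)
    true_x[:5] = 1.0
    b = X @ true_x + 0.01 * rng.standard_normal(500)
    print(np.round(minibatch_stochastic_admm(X, b), 2))
```

The VR variants (SVRG-ADMM, SAGA-ADMM) discussed in the abstract would replace the plain mini-batch gradient `g` above with a variance-reduced gradient estimator, which is what removes the condition on the mini-batch size.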