A nonsmooth dynamical systems perspective on accelerated extensions of
ADMM
The acceleration technique introduced by Nesterov for gradient descent is widely used in optimization, but its principles are not yet fully understood. Recently, significant progress has been made toward closing this gap through a continuous-time dynamical systems perspective on gradient-based methods for smooth, unconstrained problems. Here we extend this perspective to nonsmooth and constrained problems by deriving nonsmooth dynamical systems related to variants of the relaxed and accelerated alternating direction method of multipliers (ADMM). More specifically, we introduce two new accelerated ADMM variants, depending on two types of dissipation, and derive differential inclusions that model these algorithms in the continuous-time limit. Through a nonsmooth Lyapunov analysis, we obtain convergence rates for these dynamical systems in the convex and strongly convex settings, illustrating an interesting tradeoff between decaying and constant damping strategies.
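To fix ideas, a minimal sketch of the classical (non-accelerated) scaled-form ADMM iteration, applied to the lasso problem as in Boyd et al.'s standard exposition, looks as follows; the problem instance, parameter choices, and helper names are illustrative and not taken from the paper:

```python
import numpy as np

def soft_threshold(v, kappa):
    # Proximal operator of kappa * ||.||_1 (elementwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """Classical scaled-form ADMM for the lasso:
    minimize 0.5*||A x - b||^2 + lam*||z||_1  subject to  x = z.
    (Illustrative baseline; the paper's accelerated/relaxed variants
    add momentum and damping terms on top of updates like these.)"""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    AtA = A.T @ A + rho * np.eye(n)   # reused by every x-update
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(AtA, Atb + rho * (z - u))  # x-minimization
        z = soft_threshold(x + u, lam / rho)           # z-minimization (l1 prox)
        u = u + x - z                                  # dual (multiplier) update
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10); x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = admm_lasso(A, b, lam=0.1)
```

In the continuous-time limit studied in the paper, iterations of this kind give rise to differential inclusions rather than smooth ODEs, because the subgradient of the nonsmooth term (here the l1 norm) is set-valued.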