A Stochastic Coordinate Descent Primal-Dual Algorithm and Applications to Large-Scale Composite Optimization

International Workshop on Machine Learning for Signal Processing (MLSP), 2014
Abstract

First, we introduce a splitting algorithm to minimize a sum of three convex functions. The algorithm is of primal-dual type and is inspired by recent results of Vu and Condat. Second, we provide a randomized version of the algorithm based on the idea of coordinate descent. Third, we address two applications of our method. (i) In the case of stochastic minibatch optimization, the algorithm can be used to split a composite objective function into blocks, each of these blocks being processed sequentially by the computer. (ii) In the case of distributed optimization, we consider a set of N agents having private composite objective functions and seeking to find a consensus on the minimum of the aggregate objective. In that case, our method yields a distributed iterative algorithm where each agent uses both local computations and message passing in an asynchronous manner. Numerical results demonstrate the attractive performance of the method in the framework of large-scale machine learning applications.
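To make the setting concrete, the sketch below implements the deterministic Condat–Vu primal-dual iteration for a three-term problem min_x f(x) + g(x) + h(Lx), with f smooth and g, h proximable, which is the baseline scheme the paper randomizes. The specific choices here (least-squares f, l1 penalties for g and h, a finite-difference operator L, and the step sizes tau, sigma) are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

# Illustrative problem: min_x 0.5||Ax - b||^2 + lam||x||_1 + mu||Lx||_1
# solved with the Condat-Vu primal-dual iteration (deterministic version
# of the scheme the paper builds on). All problem data is synthetic.
rng = np.random.default_rng(0)
m, n = 30, 20
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
lam, mu = 0.1, 0.1

# L: first-order difference operator (an illustrative linear map)
L = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]

def grad_f(x):
    # gradient of the smooth term f(x) = 0.5 ||Ax - b||^2
    return A.T @ (A @ x - b)

def prox_g(x, t):
    # prox of t * lam * ||.||_1: soft-thresholding
    return np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)

def prox_hstar(y):
    # h = mu * ||.||_1, so h* is the indicator of the inf-norm ball
    # of radius mu; its prox is a clip (projection)
    return np.clip(y, -mu, mu)

# Step sizes satisfying the Condat-Vu condition
# 1/tau - sigma * ||L||^2 >= beta/2, beta = Lipschitz const of grad f
beta = np.linalg.norm(A, 2) ** 2
sigma = 1.0
tau = 0.9 / (beta / 2 + sigma * np.linalg.norm(L, 2) ** 2)

x = np.zeros(n)
y = np.zeros(L.shape[0])
for _ in range(2000):
    x_new = prox_g(x - tau * (grad_f(x) + L.T @ y), tau)   # primal step
    y = prox_hstar(y + sigma * L @ (2 * x_new - x))        # dual step
    x = x_new

obj = (0.5 * np.linalg.norm(A @ x - b) ** 2
       + lam * np.abs(x).sum() + mu * np.abs(L @ x).sum())
```

The paper's randomized variant would update only a random block of coordinates per iteration instead of the full vectors, which is what enables the minibatch and asynchronous distributed applications described above.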
