A Stochastic Coordinate Descent Primal-Dual Algorithm and Applications to Large-Scale Composite Optimization

International Workshop on Machine Learning for Signal Processing (MLSP), 2014
Abstract

Based on the idea of randomized coordinate descent of α-averaged operators, we provide a randomized primal-dual algorithm. The algorithm builds upon a variant of a recent (deterministic) algorithm proposed by Vũ and Condat. Next, we address two applications of our method. (i) In the case of stochastic approximation methods, the algorithm can be used to split a composite objective function into blocks, each of these blocks being processed sequentially by the computer. (ii) In the case of distributed optimization, we consider a set of N agents having private composite objective functions and seeking to find a consensus on the minimum of the aggregate objective. In that case, our method yields a distributed iterative algorithm where each agent uses both local computations and message passing in an asynchronous manner. Numerical results demonstrate the attractive performance of the method in the framework of large-scale machine learning applications.
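To give a flavor of the coordinate-wise splitting idea mentioned in (i), the following is a minimal toy sketch of randomized coordinate proximal-gradient descent on a composite objective (here a lasso problem). It is an illustration of the general principle only, not the paper's primal-dual algorithm; the problem data, step sizes, and function names are assumptions made for this example.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*|.| (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def random_cd_lasso(A, b, lam, n_iters=20000, seed=0):
    """Toy randomized coordinate descent for
    F(x) = 0.5*||A x - b||^2 + lam*||x||_1.
    At each step one random coordinate (block) is updated,
    illustrating how a composite objective splits into blocks."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    r = A @ x - b                      # running residual A x - b
    col_sq = (A ** 2).sum(axis=0)      # per-coordinate smoothness constants
    for _ in range(n_iters):
        j = rng.integers(n)            # pick a random coordinate (block)
        g = A[:, j] @ r                # partial gradient of the smooth part
        step = 1.0 / col_sq[j]
        x_new = soft_threshold(x[j] - step * g, lam * step)
        r += A[:, j] * (x_new - x[j])  # incremental residual update
        x[j] = x_new
    return x
```

Each iteration touches only one column of A, which is what makes such block-wise schemes attractive at large scale; the paper's contribution is to carry this randomization over to a primal-dual (Vũ–Condat-type) splitting with convergence guarantees.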
