
Conditional Gradient Algorithms for Norm-Regularized Smooth Convex Optimization

Mathematical Programming (Math. Program.), 2013
Abstract

Motivated by applications in signal processing and machine learning, we consider two convex optimization problems where, given a cone $K$, a norm $\|\cdot\|$, and a smooth convex function $f$, we want either (1) to minimize the norm over the intersection of the cone and a level set of $f$, or (2) to minimize over the cone the sum of $f$ and a multiple of the norm. We focus on the case where (a) the dimension of the problem is too large to allow for interior point algorithms, and (b) $\|\cdot\|$ is "too complicated" to allow for the computationally cheap Bregman projections required by first order proximal algorithms. On the other hand, we assume that the intersection of the unit ball of $\|\cdot\|$ with $K$ admits a cheap minimization oracle capable of minimizing linear forms over this intersection. Motivating examples are given by the nuclear norm, with $K$ being either the entire space of matrices or the positive semidefinite cone in the space of symmetric matrices, and by the Total Variation norm on the space of 2D images. We discuss versions of the Conditional Gradient algorithm (in its original form aimed at minimizing smooth convex functions over bounded domains given by minimization oracles) capable of handling our problems of interest, provide the related theoretical efficiency estimates, and outline some applications.
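For orientation, here is a minimal sketch of the classical Conditional Gradient (Frank-Wolfe) scheme the abstract takes as its starting point: minimizing a smooth convex function over a bounded domain accessed only through a linear minimization oracle. This is not the paper's modified variants for norm-regularized problems; the names `conditional_gradient`, `grad_f`, and `lmo`, and the $\ell_1$-ball example are illustrative assumptions. The step size $\gamma_k = 2/(k+2)$ is the standard choice, giving the well-known $O(1/k)$ accuracy bound.

```python
import numpy as np

def conditional_gradient(grad_f, lmo, x0, num_iters=100):
    """Classical conditional gradient (Frank-Wolfe) sketch.

    grad_f : callable returning the gradient of the smooth convex objective f.
    lmo    : linear minimization oracle; given a gradient g, returns
             argmin_{y in D} <g, y> over the bounded feasible set D.
    x0     : feasible starting point.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(num_iters):
        g = grad_f(x)
        y = lmo(g)                  # oracle call: minimize the linear form <g, .>
        gamma = 2.0 / (k + 2.0)     # standard step size, yields O(1/k) accuracy
        x = (1.0 - gamma) * x + gamma * y
    return x

# Illustrative use: minimize f(x) = 0.5 * ||x - b||^2 over the l1 ball of
# radius 1, whose oracle returns a signed vertex of the ball.
b = np.array([0.3, -0.8, 0.2])
grad_f = lambda x: x - b

def lmo(g):
    v = np.zeros_like(g)
    i = np.argmax(np.abs(g))
    v[i] = -np.sign(g[i])           # vertex of the l1 ball minimizing <g, .>
    return v

x_star = conditional_gradient(grad_f, lmo, np.zeros(3))
```

The point of the oracle interface is the one the abstract emphasizes: each iteration needs only a linear minimization over the domain, which for sets like the nuclear-norm or Total Variation balls can be far cheaper than the projections required by proximal methods.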
