Conditional Gradient Algorithms for Norm-Regularized Smooth Convex
Optimization
Motivated by some applications in signal processing and machine learning, we consider two convex optimization problems where, given a cone K, a norm ∥·∥ and a smooth convex function f, we want either 1) to minimize the norm over the intersection of the cone and a level set of f, or 2) to minimize over the cone the sum of f and a multiple of the norm. We focus on the case where (a) the dimension of the problem is too large to allow for interior point algorithms, and (b) ∥·∥ is "too complicated" to allow for the computationally cheap Bregman projections required by first-order proximal algorithms. On the other hand, we assume that the intersection of the unit ball of the norm with the cone admits a cheap Minimization Oracle capable of minimizing linear forms over this intersection. Motivating examples are given by the nuclear norm, with the cone being either the entire space of matrices or the positive semidefinite cone in the space of symmetric matrices, and by the Total Variation norm on the space of 2D images. We discuss versions of the Conditional Gradient algorithm (in its original form aimed at minimizing smooth convex functions over bounded domains given by minimization oracles) capable of handling our problems of interest, provide the related theoretical efficiency estimates, and outline some applications.
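For orientation, the original Conditional Gradient (Frank-Wolfe) scheme mentioned above can be sketched as follows. This is a minimal illustration, not the paper's modified algorithms: the quadratic objective, the ℓ1-ball domain, and the step-size rule γ_t = 2/(t+2) are standard textbook choices assumed here for concreteness, chosen because the ℓ1 ball admits exactly the kind of cheap Minimization Oracle the abstract describes.

```python
import numpy as np

def lmo_l1_ball(grad, radius=1.0):
    """Minimization Oracle for the l1 ball: the minimizer of the linear form
    <grad, s> over {s : ||s||_1 <= radius} is a signed vertex of the ball."""
    i = int(np.argmax(np.abs(grad)))
    s = np.zeros_like(grad)
    s[i] = -radius * np.sign(grad[i])
    return s

def conditional_gradient(grad_f, x0, lmo, iters=200):
    """Classic Frank-Wolfe: repeatedly call the oracle on the current gradient
    and step toward the returned vertex; iterates stay in the convex domain."""
    x = x0.astype(float)
    for t in range(iters):
        s = lmo(grad_f(x))        # linear minimization oracle call
        gamma = 2.0 / (t + 2.0)   # standard open-loop step size
        x = x + gamma * (s - x)   # convex combination stays feasible
    return x

# Example (illustrative): minimize 0.5 * ||x - b||^2 over the l1 unit ball.
b = np.array([0.9, -0.2, 0.1])
x_star = conditional_gradient(lambda x: x - b, np.zeros(3), lmo_l1_ball)
```

Note that the method touches the feasible set only through the oracle; no projection onto the ℓ1 ball is ever computed, which is the property that makes this family attractive for norms (such as nuclear norm or Total Variation) whose projections are expensive.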