
A First Order Free Lunch for SQRT-Lasso

Abstract

Many statistical machine learning techniques sacrifice convenient computational structures to gain estimation robustness and modeling flexibility. In this paper, we study this fundamental tradeoff through a SQRT-Lasso problem for sparse linear regression and sparse precision matrix estimation in high dimensions. We explain how novel optimization techniques help address these computational challenges. In particular, we propose a pathwise iterative smoothing shrinkage thresholding algorithm for solving the SQRT-Lasso optimization problem. We further provide a novel model-based perspective for analyzing the smoothing optimization framework, which allows us to establish a nearly linear (R-linear) convergence guarantee for our proposed algorithm. This implies that solving the SQRT-Lasso optimization problem is almost as easy as solving the Lasso optimization problem. Moreover, we show that our proposed algorithm can also be applied to sparse precision matrix estimation, and enjoys good computational properties. Numerical experiments are provided to support our theory.
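To make the idea concrete, below is a minimal sketch (not the paper's actual algorithm) of a smoothed SQRT-Lasso solved by iterative shrinkage thresholding. It assumes the standard SQRT-Lasso objective, min over beta of ||y - X beta||_2 + lam * ||beta||_1, and smooths the non-differentiable l2 loss as sqrt(||y - X beta||^2 + mu^2) for a small mu > 0; the function names and the choice of smoothing are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the l1 norm (the "shrinkage thresholding" step).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def smoothed_sqrt_lasso_ista(X, y, lam, mu=0.1, n_iter=2000):
    """Illustrative sketch: ISTA applied to the smoothed SQRT-Lasso objective
        min_beta  sqrt(||y - X beta||^2 + mu^2) + lam * ||beta||_1.
    The smoothing parameter mu > 0 makes the l2 loss differentiable; this is
    a generic smoothing choice, not the scheme proposed in the paper."""
    n, d = X.shape
    beta = np.zeros(d)
    # The gradient of the smoothed loss is (1/mu)-Lipschitz in the residual,
    # so ||X||_2^2 / mu is a valid (conservative) Lipschitz constant.
    step = mu / np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        r = X @ beta - y
        grad = X.T @ r / np.sqrt(r @ r + mu ** 2)
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta
```

In a pathwise variant, one would additionally solve this problem along a decreasing sequence of regularization parameters lam, warm-starting each solve from the previous solution.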
