Square-Root Lasso: Pivotal Recovery of Sparse Signals via Conic Programming

Abstract

We propose a pivotal method for estimating high-dimensional sparse linear regression models, where the overall number of regressors p is large, possibly much larger than the sample size n, but only s regressors are significant. The method is a modification of LASSO, called square-root LASSO. The method neither relies on knowledge of the standard deviation σ of the regression errors nor needs to pre-estimate σ. Despite not knowing σ, square-root LASSO achieves near-oracle performance, attaining the convergence rate σ√((s/n) log p), and thus matching the performance of the standard LASSO that knows σ. Moreover, we show that these results are valid for both Gaussian and non-Gaussian errors, under some mild moment restrictions, using moderate deviation theory. Finally, we formulate the square-root LASSO as a solution to a convex conic programming problem, which allows us to use efficient computational methods, such as interior-point methods, to implement the estimator.
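As an informal illustration of the estimator described above, the square-root LASSO minimizes ||y − Xb||₂/√n + (λ/n)·||b||₁, where the loss is the square root of the average residual sum of squares, so σ cancels out of the penalty scaling. The sketch below solves this objective with a simple proximal-gradient loop in numpy; this is only an illustrative sketch, not the paper's method, which casts the problem as a conic program solved by interior-point methods. The step-size rule, penalty level, and function name are assumptions for the example.

```python
import numpy as np

def sqrt_lasso(X, y, lam, n_iter=2000):
    """Illustrative proximal-gradient sketch of the square-root LASSO:
        minimize  ||y - X b||_2 / sqrt(n) + (lam / n) * ||b||_1.
    Not the conic-programming formulation used in the paper.
    """
    n, p = X.shape
    b = np.zeros(p)
    Xop = np.linalg.norm(X, 2)  # largest singular value of X
    for _ in range(n_iter):
        r = y - X @ b
        nr = np.linalg.norm(r)
        if nr < 1e-12:
            break  # residual is (numerically) zero; sqrt loss nonsmooth here
        # Gradient of the sqrt loss: -X^T r / (sqrt(n) * ||r||)
        grad = -(X.T @ r) / (np.sqrt(n) * nr)
        # Conservative local step size (assumption: half the inverse of a
        # local Lipschitz bound ||X||^2 / (sqrt(n) * ||r||) on the gradient)
        step = 0.5 * np.sqrt(n) * nr / Xop**2
        z = b - step * grad
        # Soft-thresholding: proximal operator of the l1 penalty
        thr = step * lam / n
        b = np.sign(z) * np.maximum(np.abs(z) - thr, 0.0)
    return b
```

On simulated data with a few strong coefficients, the same penalty level λ (of order √(n log p)) recovers the support without any input of σ, which is the pivotal property the abstract emphasizes.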
