Square-Root Lasso: Pivotal Recovery of Sparse Signals via Conic Programming
We propose a pivotal method for estimating high-dimensional sparse linear regression models, where the overall number of regressors p is large, possibly much larger than the sample size n, but only s regressors are significant. The method is a modification of the LASSO, called the square-root LASSO. The method neither relies on knowledge of the standard deviation σ of the regression errors nor needs to pre-estimate σ. Despite not knowing σ, the square-root LASSO achieves near-oracle performance, attaining the convergence rate σ√((s/n) log p), and thus matching the performance of the standard LASSO that knows σ. Moreover, we show that these results are valid for both Gaussian and non-Gaussian errors, under some mild moment restrictions, using moderate deviation theory. Finally, we formulate the square-root LASSO as a solution to a convex conic programming problem, which allows us to use efficient computational methods, such as interior point methods, to implement the estimator.
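Below is a minimal sketch of the estimator as a second-order cone program, written with the CVXPY modeling layer rather than the authors' own code. It assumes the objective min over β of ||y − Xβ||₂/√n + (λ/n)||β||₁ and a pivotal penalty of the form λ = c·√n·Φ⁻¹(1 − α/(2p)); the constants c = 1.1 and α = 0.05 and the simulated-data example are illustrative choices, not prescriptions from the abstract.

```python
# Sketch: square-root LASSO as a second-order cone program via CVXPY.
# Note that the penalty lambda does not depend on the noise level sigma.
import numpy as np
import cvxpy as cp
from scipy.stats import norm


def sqrt_lasso(X, y, c=1.1, alpha=0.05):
    n, p = X.shape
    # Pivotal penalty choice (assumed form): c * sqrt(n) * Phi^{-1}(1 - alpha/(2p)).
    lam = c * np.sqrt(n) * norm.ppf(1 - alpha / (2 * p))
    beta = cp.Variable(p)
    # Objective: root of the average squared residual plus a scaled l1 penalty.
    objective = cp.norm(y - X @ beta, 2) / np.sqrt(n) + (lam / n) * cp.norm1(beta)
    cp.Problem(cp.Minimize(objective)).solve()  # conic (SOCP) solve
    return beta.value


if __name__ == "__main__":
    # Toy example: n = 100 observations, p = 500 regressors, s = 5 significant ones.
    rng = np.random.default_rng(0)
    n, p, s = 100, 500, 5
    X = rng.standard_normal((n, p))
    beta_true = np.zeros(p)
    beta_true[:s] = 1.0
    y = X @ beta_true + rng.standard_normal(n)
    beta_hat = sqrt_lasso(X, y)
    print("selected indices:", np.flatnonzero(np.abs(beta_hat) > 1e-3))
```

The square-root in the residual term is what removes σ from the penalty: because the score of the objective is self-normalized, the penalty level can be set from known quantities (n, p, α) alone, while the problem remains a convex cone program solvable by interior point methods.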