
Model selection with lasso-zero: adding straw to the haystack to better find needles

Abstract

The high-dimensional linear model $y = X\beta^0 + \epsilon$ is considered, and the focus is put on the problem of recovering the support $S^0$ of the sparse vector $\beta^0$. We introduce Lasso-Zero, a new $\ell_1$-based estimator whose novelty resides in an "overfit, then threshold" paradigm and the use of noise dictionaries concatenated to $X$ for overfitting the response. To select the threshold, we employ the quantile universal threshold based on a pivotal statistic that requires neither knowledge nor preliminary estimation of the noise level. Numerical simulations show that Lasso-Zero performs well in terms of support recovery and provides an excellent trade-off between a high true positive rate and a low false discovery rate compared to competitors. Our methodology is supported by theoretical results showing that when no noise dictionary is used, Lasso-Zero recovers the signs of $\beta^0$ under weaker conditions on $X$ and $S^0$ than the Lasso, and achieves sign consistency for correlated Gaussian designs. The use of noise dictionaries improves the procedure for low signals.
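The "overfit, then threshold" idea described above can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: it solves basis pursuit (the minimum-$\ell_1$-norm solution, i.e. the Lasso with penalty tending to zero) as a linear program, appends random Gaussian noise dictionaries to $X$, aggregates the resulting coefficients by a median over repetitions, and then hard-thresholds. The function names (`basis_pursuit`, `lasso_zero_sketch`), the parameters `q`, `M`, and the fixed threshold `tau` are all illustrative assumptions; in particular, the paper selects the threshold with the quantile universal threshold rather than a fixed value.

```python
import numpy as np
from scipy.optimize import linprog


def basis_pursuit(X, y):
    """Minimum-l1-norm solution of X b = y via the standard LP split b = b+ - b-."""
    n, p = X.shape
    c = np.ones(2 * p)                 # objective: sum(b+) + sum(b-) = ||b||_1
    A_eq = np.hstack([X, -X])          # equality constraint: X (b+ - b-) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    return res.x[:p] - res.x[p:]


def lasso_zero_sketch(X, y, q=1, M=5, tau=1.0, seed=0):
    """Illustrative Lasso-Zero: overfit with noise dictionaries, then threshold.

    q (noise columns per observation), M (repetitions) and tau (hard
    threshold) are illustrative; the paper instead calibrates the
    threshold with the quantile universal threshold.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    coefs = np.empty((M, p))
    for m in range(M):
        G = rng.standard_normal((n, q * n))
        G /= np.linalg.norm(G, axis=0)         # unit-norm noise columns (assumed scaling)
        coefs[m] = basis_pursuit(np.hstack([X, G]), y)[:p]
    med = np.median(coefs, axis=0)             # aggregate over noise dictionaries
    return np.where(np.abs(med) > tau, med, 0.0)
```

In a noiseless toy problem (e.g. $n = 30$, $p = 60$, three nonzero coefficients of amplitude 5), the thresholded median coefficients typically recover the true support exactly; with noise, the median over repetitions is what separates persistent true signals from coefficients absorbed by the noise columns.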
