

Model selection with lasso-zero: adding straw to the haystack to better find needles

14 May 2018
Pascaline Descloux
S. Sardy
Abstract

The high-dimensional linear model $y = X\beta^0 + \epsilon$ is considered, and the focus is on the problem of recovering the support $S^0$ of the sparse vector $\beta^0$. We introduce Lasso-Zero, a new $\ell_1$-based estimator whose novelty resides in an "overfit, then threshold" paradigm and in the use of noise dictionaries concatenated to $X$ for overfitting the response. To select the threshold, we employ the quantile universal threshold based on a pivotal statistic that requires neither knowledge nor preliminary estimation of the noise level. Numerical simulations show that Lasso-Zero performs well in terms of support recovery and provides an excellent trade-off between a high true positive rate and a low false discovery rate compared to its competitors. Our methodology is supported by theoretical results showing that, when no noise dictionary is used, Lasso-Zero recovers the signs of $\beta^0$ under weaker conditions on $X$ and $S^0$ than the Lasso, and achieves sign consistency for correlated Gaussian designs. The use of noise dictionaries improves the procedure for low signals.
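The "overfit, then threshold" idea described in the abstract can be sketched in a few lines: concatenate a random noise dictionary $G$ to the design $X$, fit the response exactly with a minimum-$\ell_1$ (basis pursuit) solution, and keep only the coefficients on the original columns that exceed a threshold. The sketch below is an illustration under simplifying assumptions, not the authors' implementation: it uses a single Gaussian noise dictionary, solves basis pursuit via a linear program, and takes a fixed threshold `tau` rather than the paper's quantile universal threshold.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||b||_1 subject to A b = y, via the standard LP split
    b = u - v with u, v >= 0."""
    n, p = A.shape
    c = np.ones(2 * p)                       # objective: sum(u) + sum(v) = ||b||_1
    A_eq = np.hstack([A, -A])                # A(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y,
                  bounds=[(0, None)] * (2 * p), method="highs")
    u, v = res.x[:p], res.x[p:]
    return u - v

def lasso_zero_sketch(X, y, q=20, tau=1.0, rng=None):
    """Illustrative 'overfit, then threshold' step (q and tau are
    assumptions, not the paper's tuning rule)."""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    G = rng.standard_normal((n, q))          # noise dictionary appended to X
    beta_full = basis_pursuit(np.hstack([X, G]), y)
    beta = beta_full[:p]                     # discard noise-dictionary coefficients
    return np.where(np.abs(beta) > tau, beta, 0.0)

# Toy example: recover a 3-sparse signal in dimension p = 100 from n = 50 samples.
rng = np.random.default_rng(0)
n, p = 50, 100
X = rng.standard_normal((n, p))
beta0 = np.zeros(p)
beta0[[3, 17, 42]] = [5.0, -4.0, 6.0]
y = X @ beta0 + 0.1 * rng.standard_normal(n)

beta_hat = lasso_zero_sketch(X, y, q=20, tau=1.0, rng=1)
print(np.nonzero(beta_hat)[0])
```

The noise dictionary gives the basis pursuit solution somewhere cheap to put the residual noise, so thresholding the coefficients on the original columns is what separates signal from overfit; in the paper this threshold is chosen data-adaptively via the quantile universal threshold rather than fixed as here.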
