arXiv:1605.08651v3 (latest)

Slope meets Lasso: improved oracle bounds and optimality

27 May 2016
Pierre C. Bellec
Guillaume Lecué
Alexandre B. Tsybakov
Abstract

We show that two polynomial-time methods, a Lasso estimator with adaptively chosen tuning parameter and a Slope estimator, adaptively achieve the exact minimax prediction and $\ell_2$ estimation rate $(s/n)\log(p/s)$ in high-dimensional linear regression on the class of $s$-sparse target vectors in $\mathbb{R}^p$. This is done under the Restricted Eigenvalue (RE) condition for the Lasso and under a slightly more constraining assumption on the design for the Slope. The main results have the form of sharp oracle inequalities accounting for the model misspecification error. Minimax optimal bounds are also obtained for the $\ell_q$ estimation errors with $1\le q\le 2$ when the model is well-specified. The results are non-asymptotic and hold both in probability and in expectation. The assumptions that we impose on the design are satisfied with high probability for a large class of random matrices with independent and possibly anisotropically distributed rows. We give a comparative analysis of the conditions under which oracle bounds for the Lasso and Slope estimators can be obtained. In particular, we show that several known conditions, such as the RE condition and the sparse eigenvalue condition, are equivalent if the $\ell_2$-norms of the regressors are uniformly bounded.
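
For readers who want to see the Slope estimator concretely, here is a minimal sketch (not the authors' code). Slope minimizes $\frac{1}{2n}\|y - X\beta\|_2^2 + \sum_j \lambda_j |\beta|_{(j)}$, where $|\beta|_{(1)} \ge \dots \ge |\beta|_{(p)}$ are the sorted absolute entries and the weights $\lambda_j$ are nonincreasing; the sketch uses the weight shape $\lambda_j = A\sigma\sqrt{\log(2p/j)/n}$ that matches the rate in the abstract, solved by plain proximal gradient descent with the standard pool-adjacent-violators prox of the sorted-$\ell_1$ norm. The constant `A`, noise level `sigma`, and iteration count `n_iter` below are illustrative assumptions, not values from the paper.

```python
import numpy as np


def prox_sorted_l1(v, lam):
    """Prox of the sorted-l1 norm: argmin_x 0.5*||x - v||^2 + sum_j lam_j |x|_(j).

    Standard recipe: sort |v| in decreasing order, subtract the weights,
    project onto the nonincreasing cone (pool-adjacent-violators), clip at
    zero, then restore the signs and the original order.
    """
    signs = np.sign(v)
    order = np.argsort(-np.abs(v))
    z = np.abs(v)[order] - lam

    # Nonincreasing isotonic regression of z via pool-adjacent-violators:
    # merge adjacent blocks whenever a later block average exceeds an earlier one.
    vals, counts = [], []
    for zi in z:
        vals.append(zi)
        counts.append(1)
        while len(vals) > 1 and vals[-1] > vals[-2]:
            v2, c2 = vals.pop(), counts.pop()
            v1, c1 = vals.pop(), counts.pop()
            vals.append((c1 * v1 + c2 * v2) / (c1 + c2))
            counts.append(c1 + c2)

    x = np.maximum(np.repeat(vals, counts), 0.0)
    out = np.empty_like(v, dtype=float)
    out[order] = x
    return signs * out


def slope(X, y, A=1.01, sigma=1.0, n_iter=500):
    """Slope by proximal gradient descent (ISTA) on the least-squares loss."""
    n, p = X.shape
    lam = A * sigma * np.sqrt(np.log(2 * p / np.arange(1, p + 1)) / n)
    step = n / np.linalg.norm(X, ord=2) ** 2  # 1 / Lipschitz const. of the gradient
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        beta = prox_sorted_l1(beta - step * grad, step * lam)
    return beta
```

Note that, unlike the Lasso's single tuning parameter, the Slope weights decay with the rank $j$, which is what lets the estimator adapt to the unknown sparsity $s$ without a data-driven choice of a scalar penalty level.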
