The Lasso with general Gaussian designs with applications to hypothesis testing

27 July 2020
Michael Celentano
Andrea Montanari
Yuting Wei
arXiv:2007.13716
Abstract

The Lasso is a method for high-dimensional regression, which is now commonly used when the number of covariates $p$ is of the same order as, or larger than, the number of observations $n$. Classical asymptotic normality theory does not apply to this model for two fundamental reasons: $(1)$ the regularized risk is non-smooth; $(2)$ the distance between the estimator $\widehat{\boldsymbol{\theta}}$ and the true parameter vector $\boldsymbol{\theta}^*$ cannot be neglected. As a consequence, standard perturbative arguments that are the traditional basis for asymptotic normality fail. On the other hand, the Lasso estimator can be precisely characterized in the regime in which both $n$ and $p$ are large and $n/p$ is of order one. This characterization was first obtained in the case of Gaussian designs with i.i.d. covariates; here we generalize it to Gaussian correlated designs with non-singular covariance structure. The characterization is expressed in terms of a simpler "fixed-design" model. We establish non-asymptotic bounds on the distance between the distributions of various quantities in the two models, which hold uniformly over signals $\boldsymbol{\theta}^*$ in a suitable sparsity class and over values of the regularization parameter. As an application, we study the distribution of the debiased Lasso and show that a degrees-of-freedom correction is necessary for computing valid confidence intervals.
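
The degrees-of-freedom correction mentioned in the abstract can be illustrated with a short sketch. The snippet below is a minimal illustration rather than the paper's exact procedure: it assumes the row covariance `Sigma` is known and non-singular, uses scikit-learn's `Lasso` for the base estimator, estimates the degrees of freedom by the number of nonzero Lasso coefficients, and rescales the debiasing term by $n - \widehat{\mathrm{df}}$ instead of $n$. The function name and arguments are hypothetical.

```python
import numpy as np
from sklearn.linear_model import Lasso

def debiased_lasso_dof(X, y, Sigma, lam):
    """Degrees-of-freedom-adjusted debiased Lasso (illustrative sketch).

    X     : (n, p) design with rows ~ N(0, Sigma)
    y     : (n,) response vector
    Sigma : (p, p) known, non-singular covariance of the rows of X
    lam   : Lasso regularization parameter (scikit-learn's `alpha`)
    """
    n, p = X.shape
    # Lasso fit; scikit-learn minimizes (1/(2n))||y - X theta||^2 + alpha * ||theta||_1
    theta_hat = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_
    # For the Lasso, the number of nonzero coefficients estimates its degrees of freedom
    df_hat = np.count_nonzero(theta_hat)
    residual = y - X @ theta_hat
    # Degrees-of-freedom correction: rescale the debiasing term by n - df_hat
    # rather than n (assumes df_hat < n, i.e. a sufficiently sparse solution)
    debias = np.linalg.solve(Sigma, X.T @ residual) / (n - df_hat)
    return theta_hat + debias
```

Under the conditions studied in the paper, coordinates of such a corrected estimate are approximately Gaussian around the corresponding entries of $\boldsymbol{\theta}^*$, which is what makes normal-theory confidence intervals valid; without the $n - \widehat{\mathrm{df}}$ rescaling, the resulting intervals can be miscalibrated.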
