arXiv:1306.0623
Asymptotics of the $\ell_\infty$ Norm of Gaussian Vectors and Finite Sample Inference for Low-Dimensional Linear Dependency

3 June 2013
Kai Zhang
Abstract

Detecting a low-dimensional linear dependency in high-dimensional data is an important problem. We provide a perspective on this problem through studies of the $\ell_\infty$ norms of possibly degenerate Gaussian vectors whose dimension is $p$ but whose correlation matrix has rank $d \le p$. We find a precise asymptotic upper bound on such extreme values, $\sqrt{d(1-p^{-2/d})}$. This upper bound is shown to be sharp when the entries of the correlation matrix are generated as inner products of i.i.d. uniformly distributed unit vectors. The bound also exhibits an interesting trichotomy across different ranges of $d$. Based on these results, we propose several methods for high-dimensional inference. The first application is a general hard-threshold rule for variable selection in regression. The second is a refinement of valid post-selection inference when the size of selected models is restricted. The third is an inference method for low-dimensional linear dependency. One advantage of this approach is that the asymptotics are in the dimensions $d$ and $p$, not in the sample size $n$; thus inference can be made even when $n < d \le p$. Moreover, the higher the dimension, the more accurate the inference.
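The quantities in the abstract can be illustrated numerically. The sketch below (an illustration only, not the paper's method) builds a rank-$d$ Gaussian vector of dimension $p$ whose correlation matrix has entries given by inner products of i.i.d. uniform unit vectors in $\mathbb{R}^d$ (the construction under which the bound is stated to be sharp), and compares its $\ell_\infty$ norm to the bound $\sqrt{d(1-p^{-2/d})}$; all variable names here are my own.

```python
import numpy as np

def linfty_bound(d, p):
    """Asymptotic upper bound sqrt(d * (1 - p**(-2/d))) from the abstract."""
    return np.sqrt(d * (1.0 - p ** (-2.0 / d)))

rng = np.random.default_rng(0)
d, p = 5, 2000

# i.i.d. uniform unit vectors u_1, ..., u_p in R^d: normalize standard Gaussians.
U = rng.standard_normal((p, d))
U /= np.linalg.norm(U, axis=1, keepdims=True)

# Rank-d Gaussian vector X = U @ Z with Z ~ N(0, I_d),
# so Corr(X_i, X_j) = <u_i, u_j> and the correlation matrix has rank d.
Z = rng.standard_normal(d)
X = U @ Z

print("max |X_i| =", np.max(np.abs(X)), " bound =", linfty_bound(d, p))
```

Writing $1-p^{-2/d} = 1-e^{-2\ln p/d}$ gives one way to see the three regimes: when $d/\ln p \to 0$ the bound behaves like $\sqrt{d}$, while when $d/\ln p \to \infty$ it behaves like $\sqrt{2\ln p}$, the classical extreme value for $p$ independent standard Gaussians.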
