
ReX: Finite Sample Inference for Low Ranks in High-Dimensional Gaussian Vectors through the $\ell_\infty$ Norm

Abstract

Detecting a low-dimensional linear dependency in high-dimensional data is an important problem. We provide a perspective on this problem, called the rank-extreme (ReX) association, through studies of the $\ell_\infty$ norms of possibly degenerate Gaussian vectors whose dimension is $p$ but whose correlation matrix has rank $d \le p$. We find a precise asymptotic upper bound on such extreme values, $\sqrt{d(1-p^{-2/d})}$. This upper bound is shown to be sharp when the entries of the correlation matrix are generated as inner products of i.i.d. uniformly distributed unit vectors. The bound also exhibits an interesting trichotomy phenomenon depending on the limit of $d/\log p$. Based on this ReX approach, we propose several methods for high-dimensional inference. The first application is a test of overall significance in regression. The second is a refinement of valid post-selection inference when the size of the selected models is restricted. The third is an inference method for low-dimensional linear dependency. One advantage of this approach is that the asymptotics are in the dimensions $d$ and $p$, not in the sample size $n$; thus, inference can be made even when $n < d \le p$. Furthermore, the higher the dimension, the more accurate the inference. These results can therefore be regarded as a "blessing of dimensionality."
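As an illustrative sketch (not code from the paper), the bound $\sqrt{d(1-p^{-2/d})}$ can be evaluated numerically to see two ends of the trichotomy in $d/\log p$: when $d \ll \log p$ the factor $1-p^{-2/d}$ tends to 1 and the bound behaves like $\sqrt{d}$, while when $d \gg \log p$ a first-order expansion gives $d(1-p^{-2/d}) \approx 2\log p$, recovering the classical $\sqrt{2\log p}$ rate for Gaussian maxima. The function name below is our own choice for illustration.

```python
import math

def rex_bound(d, p):
    """Asymptotic upper bound sqrt(d * (1 - p**(-2/d))) from the abstract."""
    return math.sqrt(d * (1.0 - p ** (-2.0 / d)))

# Regime d << log p: p**(-2/d) is negligible, so the bound is close to sqrt(d).
print(rex_bound(5, 10 ** 300))       # close to sqrt(5)

# Regime d >> log p: 1 - exp(-2 log(p)/d) ~ 2 log(p)/d, so the bound is
# close to the classical Gaussian-maximum rate sqrt(2 * log(p)).
print(rex_bound(10 ** 6, 100))       # close to sqrt(2 * log(100))
```

The intermediate regime, where $d/\log p$ converges to a positive constant, interpolates between these two rates.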
