Testing for change points in sequences of high-dimensional covariance matrices is an important and equally challenging problem in statistical methodology, with applications in various fields. Motivated by the observation that tests based on fixed-dimension asymptotics fail to keep their preassigned level even when the ratio between dimension and sample size is small, we propose to derive critical values of test statistics using an asymptotic regime in which the dimension diverges at the same rate as the sample size. This paper introduces a novel and well-founded statistical methodology for detecting change points in a sequence of high-dimensional covariance matrices. Our approach uses a min-type statistic based on a sequential process of likelihood ratio statistics. This is used to construct a test for the existence of a change point together with a corresponding estimator of its location. We provide theoretical guarantees for these inference tools by thoroughly analyzing the asymptotic properties of the sequential process of likelihood ratio statistics in the regime where the dimension and the sample size diverge to infinity at the same rate. In particular, we prove weak convergence towards a Gaussian process under the null hypothesis of no change. To characterize the challenging dependency structure between consecutive likelihood ratio statistics, we employ tools from random matrix theory and the theory of stochastic processes. Moreover, we show that the new test attains power under a class of alternatives reflecting changes in the bulk of the spectrum, and we prove consistency of the estimator of the change-point location.
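To make the construction concrete, the following minimal sketch computes, for every candidate split of the sample, the classical two-sample Gaussian likelihood ratio statistic for equality of covariance matrices and returns the split with the largest value. This is only an illustration of the sequential likelihood-ratio scan under simplifying assumptions (known zero means, an unstandardized statistic, the hypothetical function name sequential_cov_lr and the segment-length floor min_seg are choices made here); the paper itself standardizes these statistics, forms a min-type functional, and calibrates critical values under proportional-dimension asymptotics rather than using the raw maximum shown below.

# Hypothetical sketch, not the paper's exact statistic: a sequential scan of
# classical Gaussian log-likelihood-ratio statistics for a covariance change.
import numpy as np

def sequential_cov_lr(X, min_seg=None):
    """For each candidate split k, compute -2 log Lambda_k comparing the
    sample covariances of X[:k] and X[k:] with the pooled covariance.
    X is an (n, p) array of observations assumed to have mean zero."""
    n, p = X.shape
    if min_seg is None:
        min_seg = p + 1  # both segments need a non-singular covariance estimate
    stats = {}
    for k in range(min_seg, n - min_seg + 1):
        S1 = X[:k].T @ X[:k] / k            # covariance before the split
        S2 = X[k:].T @ X[k:] / (n - k)      # covariance after the split
        Sp = X.T @ X / n                    # pooled covariance under "no change"
        _, ld1 = np.linalg.slogdet(S1)
        _, ld2 = np.linalg.slogdet(S2)
        _, ldp = np.linalg.slogdet(Sp)
        # classical two-sample log-likelihood ratio for equal covariances
        stats[k] = n * ldp - k * ld1 - (n - k) * ld2
    k_hat = max(stats, key=stats.get)       # candidate change-point location
    return stats, k_hat

# Example: a change in covariance scale halfway through the sample
rng = np.random.default_rng(0)
n, p = 400, 40
X = np.vstack([rng.standard_normal((n // 2, p)),
               2.0 * rng.standard_normal((n // 2, p))])
stats, k_hat = sequential_cov_lr(X)
print("estimated change point:", k_hat)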