We study the detection of a sparse change in a high-dimensional mean vector as a minimax testing problem. Our first main contribution is to derive the exact minimax testing rate across all parameter regimes for $n$ independent, $p$-variate Gaussian observations. This rate exhibits a phase transition when the sparsity level is of order $\sqrt{p \log\log(8n)}$ and has a very delicate dependence on the sample size: in a certain sparsity regime it involves a triple iterated logarithmic factor in $n$. Further, in a dense asymptotic regime, we identify the sharp leading constant, while in the corresponding sparse asymptotic regime, this constant is determined to within a factor of $\sqrt{2}$. Extensions that cover spatial and temporal dependence, primarily in the dense case, are also provided.
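For concreteness, the testing problem can be sketched as follows; the notation ($X_i$, $\theta$, $z$, $s$) is ours and reflects the standard sparse change point formulation, not necessarily the paper's exact parametrisation. Given independent observations $X_i \sim N_p(\mu_i, \sigma^2 I_p)$ for $i = 1, \dots, n$, one tests
\[
  H_0 : \mu_1 = \cdots = \mu_n
  \quad \text{vs.} \quad
  H_1 : \mu_i = \mu + \theta\,\mathbf{1}\{i > z\} \ \text{for some } z \in \{1,\dots,n-1\},\ \theta \in \mathbb{R}^p,\ 0 < \|\theta\|_0 \le s,
\]
so that under the alternative the mean shifts at an unknown time $z$ by a vector $\theta$ with at most $s$ nonzero coordinates. The minimax rate described above measures the smallest change magnitude $\|\theta\|_2$ at which the two hypotheses can be reliably distinguished.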