Minimax Rates for Estimating the Dimension of a Manifold

Many algorithms in machine learning and computational geometry require, as input, the intrinsic dimension of the manifold that supports the probability distribution of the data. This parameter is rarely known and therefore has to be estimated. We characterize the statistical difficulty of this problem by deriving upper and lower bounds on the minimax rate for estimating the dimension. First, we consider the problem of testing the hypothesis that the support of the data-generating probability distribution is a well-behaved manifold of intrinsic dimension $d_1$ versus the alternative that it is of dimension $d_2$, with $d_1 < d_2$. With an i.i.d. sample of size $n$, we provide an upper bound on the probability of choosing the wrong dimension of $O\left(n^{-(d_2/d_1 - 1 - \epsilon)n}\right)$, where $\epsilon$ is an arbitrarily small positive number. The proof is based on bounding the length of the traveling salesman path through the data points. We also demonstrate a lower bound of $\Omega\left(n^{-(2d_2 - 2d_1 + \epsilon)n}\right)$, by applying Le Cam's lemma with a specific set of $d_1$-dimensional probability distributions. We then extend these results to obtain minimax upper and lower bounds on the rate for estimating the dimension of well-behaved manifolds. We obtain an upper bound of order $O\left(n^{-(\frac{m}{m-1} - 1 - \epsilon)n}\right)$ and a lower bound of order $\Omega\left(n^{-(2 + \epsilon)n}\right)$, where $m$ is the embedding dimension.
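The role of the traveling salesman path can be illustrated with a small sketch, which is not the authors' procedure but a toy decision rule built on the same idea: through $n$ points drawn from a $d$-dimensional support, the shortest path grows roughly like $n^{1-1/d}$, so its observed length separates a smaller dimension from a larger one. The functions `greedy_path_length` and `guess_dimension` and the threshold constant `c` below are illustrative choices, not quantities from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def greedy_path_length(points):
    """Length of a nearest-neighbor tour through the points -- a cheap
    surrogate for the optimal traveling salesman path."""
    remaining = list(range(1, len(points)))
    current, total = 0, 0.0
    while remaining:
        dists = np.linalg.norm(points[remaining] - points[current], axis=1)
        j = int(np.argmin(dists))
        total += float(dists[j])
        current = remaining.pop(j)
    return total

def guess_dimension(sample, d1, d2, c=3.0):
    """Toy rule (illustration only): on a d-dimensional support the shortest
    path through n points grows roughly like n^(1 - 1/d), so compare the
    observed path length to a threshold of that order for the smaller
    dimension d1. The constant c is an arbitrary choice, not from the paper."""
    n = len(sample)
    threshold = c * n ** (1 - 1.0 / d1)
    return d1 if greedy_path_length(sample) <= threshold else d2

# 500 points on a segment (intrinsic dimension 1) vs. points filling a
# unit square (intrinsic dimension 2), both embedded in R^3.
t = rng.uniform(0, 1, 500)
segment = np.c_[t, t, np.zeros(500)]
square = np.c_[rng.uniform(0, 1, (500, 2)), np.zeros(500)]
print(guess_dimension(segment, 1, 2))  # expected: 1 (path stays short)
print(guess_dimension(square, 1, 2))   # expected: 2 (path grows like sqrt(n))
```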