
Multivariate mean estimation with direction-dependent accuracy

Gabor Lugosi
S. Mendelson
Abstract

We consider the problem of estimating the mean of a random vector based on $N$ independent, identically distributed observations. We prove the existence of an estimator that has a near-optimal error in all directions in which the variance of the one-dimensional marginal of the random vector is not too small: with probability $1-\delta$, the procedure returns $\widehat{\mu}_N$ which satisfies that for every direction $u \in S^{d-1}$,
\[
\langle \widehat{\mu}_N - \mu, u \rangle \le \frac{C}{\sqrt{N}} \left( \sigma(u)\sqrt{\log(1/\delta)} + \left(\mathbb{E}\|X-\mathbb{E}X\|_2^2\right)^{1/2} \right),
\]
where $\sigma^2(u) = \mathrm{Var}(\langle X,u\rangle)$ and $C$ is a constant. To achieve this, we require only slightly more than the existence of the covariance matrix, in the form of a certain moment-equivalence assumption. The proof relies on novel bounds for the ratio of empirical and true probabilities that hold uniformly over certain classes of random variables.
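To illustrate the form of the guarantee, here is a minimal numerical sketch. It is not the estimator constructed in the paper: it uses a simple coordinate-wise median-of-means estimator as a stand-in, draws an anisotropic Gaussian sample (so that $\sigma^2(u)$ genuinely depends on the direction), and compares the directional error $\langle \widehat{\mu}_N - \mu, u\rangle$ with the two terms of the bound, $\sigma(u)\sqrt{\log(1/\delta)}/\sqrt{N}$ and $(\mathbb{E}\|X-\mathbb{E}X\|_2^2)^{1/2}/\sqrt{N}$. The function name and block count are illustrative choices, not quantities from the paper.

```python
# Hypothetical sketch: a stand-in mean estimator, not the paper's construction.
import numpy as np

rng = np.random.default_rng(0)

d, N, delta = 20, 10_000, 0.01
mu = np.zeros(d)
# Anisotropic model: coordinate i has standard deviation scales[i],
# so the directional variance sigma^2(u) = sum_i scales[i]^2 * u[i]^2.
scales = np.linspace(0.1, 3.0, d)
X = mu + rng.standard_normal((N, d)) * scales  # i.i.d. sample of size N

def median_of_means(sample, k):
    """Coordinate-wise median of k block means (a simple robust stand-in)."""
    blocks = np.array_split(sample, k)
    block_means = np.array([b.mean(axis=0) for b in blocks])
    return np.median(block_means, axis=0)

# Number of blocks proportional to log(1/delta), a common heuristic choice.
mu_hat = median_of_means(X, k=8 * int(np.ceil(np.log(1 / delta))))

# (E ||X - EX||_2^2)^{1/2} for this model is the Euclidean norm of scales.
trace_term = np.sqrt((scales ** 2).sum())

# Compare the directional error with the bound along a few random directions u.
for _ in range(3):
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)                           # u on the unit sphere S^{d-1}
    sigma_u = np.sqrt((scales ** 2 * u ** 2).sum())  # sigma(u) for this model
    error = abs((mu_hat - mu) @ u)
    bound = (sigma_u * np.sqrt(np.log(1 / delta)) + trace_term) / np.sqrt(N)
    print(f"error = {error:.4f},  bound up to the constant C = {bound:.4f}")
```

In this toy setting the directional error stays well below the bound (up to the unspecified constant $C$); the point of the theorem is that a suitable estimator achieves this simultaneously for every direction $u$, under only a moment-equivalence assumption rather than Gaussianity.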
