Finite-Sample Symmetric Mean Estimation with Fisher Information Rate

The mean of an unknown variance-$\sigma^2$ distribution $f$ can be estimated from $n$ samples with variance $\frac{\sigma^2}{n}$ and nearly corresponding subgaussian rate. When $f$ is known up to translation, this can be improved asymptotically to $\frac{1}{n\mathcal{I}}$, where $\mathcal{I}$ is the Fisher information of the distribution. Such an improvement is not possible for general unknown $f$, but [Stone, 1975] showed that this asymptotic convergence is possible if $f$ is symmetric about its mean. Stone's bound is asymptotic, however: the $n$ required for convergence depends in an unspecified way on the distribution $f$ and failure probability $\delta$. In this paper we give finite-sample guarantees for symmetric mean estimation in terms of Fisher information. For every $f, n, \delta$ with $n > \log \frac{1}{\delta}$, we get convergence close to a subgaussian with variance $\frac{1}{n\mathcal{I}_r}$, where $\mathcal{I}_r$ is the $r$-smoothed Fisher information with smoothing radius $r$ that decays polynomially in $n$. Such a bound essentially matches the finite-sample guarantees in the known-$f$ setting.
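To make the smoothed quantity concrete, a natural convention (an illustrative sketch; the paper's exact normalization may differ) is to convolve $f$ with a Gaussian of radius $r$ and take the Fisher information of the result:

\[
  f_r \;=\; f * \mathcal{N}(0, r^2),
  \qquad
  \mathcal{I}_r \;=\; \int \frac{\bigl(f_r'(x)\bigr)^2}{f_r(x)}\, dx .
\]

Under this convention, $\mathcal{I}_r \le \frac{1}{r^2}$ always holds, since convolution cannot increase Fisher information beyond that of the Gaussian itself; thus the variance bound $\frac{1}{n\mathcal{I}_r}$ stays finite even when $\mathcal{I} = \infty$, while $\mathcal{I}_r \to \mathcal{I}$ as $r \to 0$ for sufficiently regular $f$.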