
Finite-Sample Symmetric Mean Estimation with Fisher Information Rate

Abstract

The mean of an unknown variance-$\sigma^2$ distribution $f$ can be estimated from $n$ samples with variance $\frac{\sigma^2}{n}$ and nearly corresponding subgaussian rate. When $f$ is known up to translation, this can be improved asymptotically to $\frac{1}{n\mathcal I}$, where $\mathcal I$ is the Fisher information of the distribution. Such an improvement is not possible for general unknown $f$, but [Stone, 1975] showed that this asymptotic convergence \textit{is} possible if $f$ is \textit{symmetric} about its mean. Stone's bound is asymptotic, however: the $n$ required for convergence depends in an unspecified way on the distribution $f$ and failure probability $\delta$. In this paper we give finite-sample guarantees for symmetric mean estimation in terms of Fisher information. For every $f, n, \delta$ with $n > \log \frac{1}{\delta}$, we get convergence close to a subgaussian with variance $\frac{1}{n \mathcal I_r}$, where $\mathcal I_r$ is the $r$-\textit{smoothed} Fisher information with smoothing radius $r$ that decays polynomially in $n$. Such a bound essentially matches the finite-sample guarantees in the known-$f$ setting.
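
As a rough illustration of the gap between the $\frac{\sigma^2}{n}$ rate and the Fisher information rate $\frac{1}{n\mathcal I}$ (a simulation sketch, not taken from the paper), consider the symmetric Laplace distribution with scale $b$: its variance is $\sigma^2 = 2b^2$ while its location Fisher information is $\mathcal I = 1/b^2$, so $\frac{1}{n\mathcal I} = \frac{b^2}{n}$ is half the sample-mean rate. The sample median, which is the location MLE for the Laplace, attains the improved rate asymptotically; all parameter choices below are illustrative.

```python
# Illustrative sketch (assumed example, not the paper's estimator):
# compare the sample mean (rate sigma^2/n = 2 b^2/n) with the sample
# median (rate 1/(n I) = b^2/n) for Laplace(mu, b) data.
import numpy as np

rng = np.random.default_rng(0)
mu, b, n, trials = 0.0, 1.0, 1000, 20000

samples = rng.laplace(mu, b, size=(trials, n))
mean_est = samples.mean(axis=1)          # variance ~ sigma^2 / n
median_est = np.median(samples, axis=1)  # variance ~ 1 / (n I)

print("n * var(sample mean):  ", n * mean_est.var())    # ~ 2 b^2
print("n * var(sample median):", n * median_est.var())  # ~ b^2
```
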
