Lower boundaries for parametric estimations in different norms
E. Ostrovsky
L. Sirota
Abstract
We establish some new non-asymptotic lower bounds, analogous to the classical Cramér–Rao inequality, for the deviation of a regular unbiased estimate of an unknown parameter from its true value, measured in different norms. We show that if the new norm is weaker than the ordinary Hilbertian norm, then the rate of convergence of an arbitrary regular unbiased estimate does not exceed $ 1/\sqrt{n}, $ and if the new norm is stronger than it, the rate of convergence of the well-known Maximum Likelihood Estimate (MLE) is also equal to $ 1/\sqrt{n}. $
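For reference, the classical Cramér–Rao inequality to which these bounds are compared can be stated as follows (a standard formulation, not taken from this paper):

```latex
% Cramér–Rao lower bound for a regular unbiased estimator \hat{\theta}_n
% of \theta, based on an i.i.d. sample X_1,\dots,X_n with density f(x;\theta);
% I(\theta) is the Fisher information of a single observation.
\[
  \operatorname{Var}_\theta\!\bigl(\hat{\theta}_n\bigr) \;\ge\; \frac{1}{n\, I(\theta)},
  \qquad
  I(\theta) \;=\; \mathbb{E}_\theta\!\left[
      \left(\frac{\partial}{\partial\theta}\,\log f(X;\theta)\right)^{2}
  \right].
\]
```

Hence the root-mean-square error of any regular unbiased estimator in the Hilbertian ($L_2$) norm is at least of order $1/\sqrt{n}$, which is the benchmark rate the abstract refers to.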
