
An unbiased estimate for the mean of a {0,1} random variable with relative error distribution independent of the mean

Abstract

Say $X_1, X_2, \ldots$ are independent identically distributed Bernoulli random variables with mean $p$. This paper builds a new estimate $\hat p$ of $p$ with the property that the distribution of the relative error, $\hat p/p - 1$, does not depend in any way on the value of $p$. This allows the construction of exact confidence intervals for $p$ of any desired level without needing any sort of limit or approximation. In addition, $\hat p$ is unbiased. For $\epsilon$ and $\delta$ in $(0,1)$, to obtain an estimate where $\mathbb{P}(|\hat p/p - 1| > \epsilon) \leq \delta$, the new algorithm takes on average at most $2\epsilon^{-2} p^{-1}\ln(2\delta^{-1})(1 - (14/3)\epsilon)^{-1}$ samples. It is also shown that any such algorithm that applies whenever $p \leq 1/2$ requires at least $0.2\epsilon^{-2} p^{-1}\ln((2-\delta)\delta^{-1})(1 + 2\epsilon)$ samples. The same algorithm can also be applied to estimate the mean of any random variable that falls in $[0,1]$.
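The abstract does not spell out the construction, but one known scheme with exactly the stated property embeds the Bernoulli draws in a rate-1 Poisson process: attach an Exp(1) gap to every trial, stop at the $k$-th success, and note that the stopping time $R$ is then Gamma$(k, p)$ distributed, so $\hat p = (k-1)/R$ is unbiased and $\hat p/p \sim (k-1)/\mathrm{Gamma}(k,1)$ regardless of $p$. The sketch below illustrates this idea; the function name `gbas` and the choice $k = 2000$ are assumptions for illustration, not taken from the abstract.

```python
import random


def gbas(sample, k):
    """Estimate the mean p of a Bernoulli stream.

    Each trial contributes an Exp(1) inter-arrival gap, so the successes
    thin a rate-1 Poisson process down to rate p.  The time R of the k-th
    success is Gamma(k, p), making (k-1)/R an unbiased estimate of p whose
    relative error distribution, (k-1)/Gamma(k,1), is free of p.
    """
    successes, elapsed = 0, 0.0
    while successes < k:
        elapsed += random.expovariate(1.0)  # Exp(1) gap for this trial
        successes += sample()               # 1 with probability p, else 0
    return (k - 1) / elapsed


random.seed(1)
p = 0.3
est = gbas(lambda: int(random.random() < p), k=2000)
```

Because the relative error has a known, $p$-free distribution, quantiles of $(k-1)/\mathrm{Gamma}(k,1)$ give exact confidence intervals; larger $k$ tightens them at the cost of roughly $k/p$ Bernoulli samples on average.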
