Deterministic Finite-Memory Bias Estimation

19 June 2022
Tomer Berg
Or Ordentlich
Ofer Shayevitz
arXiv:2206.09390
Abstract

In this paper we consider the problem of estimating a Bernoulli parameter using finite memory. Let $X_1, X_2, \ldots$ be a sequence of independent identically distributed Bernoulli random variables with expectation $\theta$, where $\theta \in [0,1]$. Consider a finite-memory deterministic machine with $S$ states that updates its state $M_n \in \{1,2,\ldots,S\}$ at each time according to the rule $M_n = f(M_{n-1}, X_n)$, where $f$ is a deterministic time-invariant function. Assume that the machine outputs an estimate at each time point according to some fixed mapping from the state space to the unit interval. The quality of the estimation procedure is measured by the asymptotic risk, which is the long-term average of the instantaneous quadratic risk. The main contribution of this paper is an upper bound on the smallest worst-case asymptotic risk any such machine can attain. This bound coincides with a lower bound derived by Leighton and Rivest, implying that $\Theta(1/S)$ is the minimax asymptotic risk for deterministic $S$-state machines. In particular, our result disproves a longstanding $\Theta(\log S / S)$ conjecture for this quantity, also posed by Leighton and Rivest.
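To make the machine model concrete, below is a minimal Python sketch of a deterministic finite-memory estimator. It assumes a simple saturating-counter update rule and a linear state-to-estimate mapping, chosen purely for illustration; the function name simulate_toy_machine and all parameters are hypothetical, and this is not the construction from the paper (in particular, it is not claimed to attain the $\Theta(1/S)$ minimax risk). It only shows the ingredients in the abstract: a state update $M_n = f(M_{n-1}, X_n)$, a fixed mapping from states to $[0,1]$, and the long-run average quadratic risk.

import random

def simulate_toy_machine(theta, S, n_steps, seed=0):
    """Simulate a toy S-state deterministic machine on i.i.d. Bernoulli(theta) inputs.

    Update rule (saturating counter, an illustrative assumption): move up one state
    on a 1, down one state on a 0, clipped to {1, ..., S}.
    Estimate: map state m to (m - 1) / (S - 1).
    This is NOT the estimator constructed in the paper; it only illustrates the
    machine model M_n = f(M_{n-1}, X_n) and the asymptotic-risk criterion.
    """
    rng = random.Random(seed)
    m = (S + 1) // 2                      # arbitrary initial state
    cum_sq_err = 0.0
    for _ in range(n_steps):
        x = rng.random() < theta          # Bernoulli(theta) sample
        m = min(S, m + 1) if x else max(1, m - 1)   # deterministic state update f
        est = (m - 1) / (S - 1)           # fixed mapping from state space to [0, 1]
        cum_sq_err += (est - theta) ** 2  # instantaneous quadratic risk
    return cum_sq_err / n_steps           # empirical long-run average risk

if __name__ == "__main__":
    for theta in (0.1, 0.5, 0.9):
        risk = simulate_toy_machine(theta, S=64, n_steps=200_000)
        print(f"theta={theta:.1f}  empirical asymptotic risk ~ {risk:.5f}")

Running the script gives a rough empirical estimate of the asymptotic risk of this particular (suboptimal) machine for a few values of $\theta$; the paper's question is how small the worst case over all $\theta$ can be made by the best possible choice of $f$ and the state-to-estimate mapping.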
