
Bayesian predictive inference without a prior

Abstract

Let $(X_n : n \ge 1)$ be a sequence of random observations. Let $\sigma_n(\cdot) = P\bigl(X_{n+1} \in \cdot \mid X_1, \ldots, X_n\bigr)$ be the $n$-th predictive distribution and $\sigma_0(\cdot) = P(X_1 \in \cdot)$ the marginal distribution of $X_1$. In a Bayesian framework, to make predictions on $(X_n)$, one only needs the collection $\sigma = (\sigma_n : n \ge 0)$. Because of the Ionescu-Tulcea theorem, $\sigma$ can be assigned directly, without passing through the usual prior/posterior scheme. One main advantage is that no prior probability has to be selected. In this paper, $\sigma$ is subjected to two requirements: (i) the resulting sequence $(X_n)$ is conditionally identically distributed, in the sense of Berti, Pratelli and Rigo (2004); (ii) each $\sigma_{n+1}$ is a simple recursive update of $\sigma_n$. Various new $\sigma$ satisfying (i)-(ii) are introduced and investigated. For such $\sigma$, the asymptotics of $\sigma_n$, as $n \to \infty$, is determined. In some cases, the probability distribution of $(X_n)$ is also evaluated.
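As a concrete illustration (not drawn from the paper itself), one classical recursive predictive of the kind described in (ii) is the Pólya-sequence/Dirichlet-process update $\sigma_{n+1} = (1 - \alpha_{n+1})\,\sigma_n + \alpha_{n+1}\,\delta_{X_{n+1}}$ with $\alpha_{n+1} = 1/(\theta + n + 1)$, which yields an exchangeable (hence conditionally identically distributed) sequence. The following minimal Python sketch, with a hypothetical function name, simulates observations from this predictive scheme:

```python
import random

def simulate_cid(base_sample, theta=1.0, n=1000, rng=None):
    """Simulate a Polya sequence: given X_1, ..., X_k, the next observation
    is a fresh draw from sigma_0 with probability theta / (theta + k),
    and a uniformly chosen past value otherwise. This realizes the
    recursive update sigma_{k+1} = (1 - a) sigma_k + a * delta_{X_{k+1}}
    with a = 1 / (theta + k + 1). (Illustrative example, not the paper's
    specific constructions.)"""
    rng = rng or random.Random()
    xs = []
    for k in range(n):
        if rng.random() < theta / (theta + k):
            xs.append(base_sample(rng))   # new draw from sigma_0
        else:
            xs.append(rng.choice(xs))     # repeat a past observation
    return xs

# Example: sigma_0 is a fair coin flip on {0, 1}
sample = simulate_cid(lambda r: r.choice([0, 1]), theta=2.0, n=500,
                      rng=random.Random(0))
```

Note the update is "simple" in the sense of (ii): $\sigma_{n+1}$ depends only on $\sigma_n$ and the new observation $X_{n+1}$, so no prior needs to be specified.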
