A probabilistic view on predictive constructions for Bayesian learning

Given a sequence $X = (X_1, X_2, \ldots)$ of random observations, a Bayesian forecaster aims to predict $X_{n+1}$ based on $(X_1, \ldots, X_n)$ for each $n \geq 0$. To this end, in principle, she only needs to select a collection $\sigma = (\sigma_0, \sigma_1, \ldots)$, called ``strategy" in what follows, where $\sigma_0(\cdot) = P(X_1 \in \cdot)$ is the marginal distribution of $X_1$ and $\sigma_n(\cdot) = P(X_{n+1} \in \cdot \mid X_1, \ldots, X_n)$ the $n$-th predictive distribution. Because of the Ionescu-Tulcea theorem, $\sigma$ can be assigned directly, without passing through the usual prior/posterior scheme. One main advantage is that no prior probability has to be selected. In a nutshell, this is the predictive approach to Bayesian learning. A concise review of the latter is provided in this paper. We try to put such an approach in the right framework, to make clear a few misunderstandings, and to provide a unifying view. Some recent results are discussed as well. In addition, some new strategies are introduced and the corresponding distribution of the data sequence is determined. The new strategies concern generalized Pólya urns, random change points, covariates and stationary sequences.
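As a purely illustrative sketch (not the paper's own generalized urn schemes), the classical Pólya-sequence predictive rule of Blackwell and MacQueen shows how a data sequence can be simulated directly from a strategy $\sigma$, with no prior ever specified; the concentration parameter `theta` and the uniform base measure below are assumptions made only for this example.

```python
import random

def polya_predictive_sample(history, theta=1.0, base_sampler=random.random):
    """Draw X_{n+1} from the classical Polya-sequence predictive rule:
    with probability theta/(theta + n) sample a fresh value from the base
    measure, otherwise repeat a past observation chosen uniformly at random."""
    n = len(history)
    if random.random() < theta / (theta + n):
        return base_sampler()        # new value from the base measure
    return random.choice(history)    # tie to a previously observed value

# Simulate a data sequence directly from the strategy; no prior is selected.
history = []
for _ in range(10):
    history.append(polya_predictive_sample(history, theta=2.0))
print(history)
```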