The generalized IFS Bayesian method and an associated variational principle covering the classical and dynamical cases

Abstract

We introduce a general IFS Bayesian method for obtaining posterior probabilities from prior probabilities, together with a generalized Bayes' rule that covers both a dynamical and a non-dynamical setting. Given a loss function l, we describe the prior and posterior objects, derive their consequences, and exhibit several examples. Taking Θ as the set of parameters and Y as the set of data (which usually arise as random samples), a general IFS is a measurable map τ: Θ × Y → Y, which can be interpreted as a family of maps τ_θ: Y → Y, θ ∈ Θ. The main inspiration for the results we obtain here comes from a paper by Zellner (with no dynamics), where Bayes' rule is related to a principle of minimization of information. We show that our IFS Bayesian method, which produces posterior probabilities (associated to holonomic probabilities), is related to the optimal solution of a variational principle, corresponding in some sense to the pressure in Thermodynamic Formalism, and also to the principle of minimization of information in Information Theory.
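The view of a general IFS as a family of maps indexed by a parameter can be sketched in a few lines. The following is a minimal toy illustration, not taken from the paper: the parameter set Θ = {0.0, 1.0} and the affine maps are hypothetical choices made only to show how fixing θ in τ: Θ × Y → Y yields a member τ_θ: Y → Y of the family.

```python
# Toy sketch of a general IFS tau: Theta x Y -> Y on Y = R.
# The affine form below is a hypothetical example, not the paper's construction.

def tau(theta: float, y: float) -> float:
    """The combined map tau(theta, y): here a toy affine contraction in y."""
    return 0.5 * y + theta

def tau_theta(theta: float):
    """Fix the parameter theta to obtain the family member tau_theta: Y -> Y."""
    return lambda y: tau(theta, y)

# Applying two members of the family to the same data point y = 1.0:
f0, f1 = tau_theta(0.0), tau_theta(1.0)
print(f0(1.0))  # 0.5
print(f1(1.0))  # 1.5
```

Here each τ_θ is a contraction, so iterating the family on data points is well behaved; the actual assumptions required of τ in the paper are only measurability.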