The generalized IFS Bayesian method and an associated variational principle covering the classical and dynamical cases

We introduce a general IFS Bayesian method for obtaining posterior probabilities from prior probabilities, together with a generalized Bayes' rule that covers a dynamical as well as a non-dynamical setting. Given a loss function, we describe the prior and posterior items and their consequences, and exhibit several examples. Given a set of parameters and a set of data (which usually provides random samples), a general IFS is a measurable map on their product, which can be interpreted as a family of maps indexed by the data. The main inspiration for the results obtained here is a paper by Zellner (with no dynamics), in which Bayes' rule is related to a principle of minimization of information. We show that our IFS Bayesian method, which produces posterior probabilities (associated to holonomic probabilities), is related to the optimal solution of a variational principle corresponding, in a certain sense, to the pressure in Thermodynamic Formalism and to the principle of minimization of information in Information Theory.
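As a minimal numerical sketch of Zellner's observation referenced above (not the paper's IFS construction itself, and with an assumed discrete parameter set and made-up prior/likelihood values), the Bayes posterior can be checked to minimize the information functional G(q) = KL(q || prior) - E_q[log likelihood] over probability vectors q:

```python
import numpy as np

# Hypothetical discrete example: 3 parameter values, assumed prior and
# likelihood of one observed data point (illustrative numbers only).
rng = np.random.default_rng(0)
prior = np.array([0.5, 0.3, 0.2])
likelihood = np.array([0.1, 0.6, 0.3])

# Bayes' rule: posterior proportional to prior times likelihood.
posterior = prior * likelihood
posterior /= posterior.sum()

def info_functional(q):
    # Zellner's functional: KL(q || prior) minus the expected
    # log-likelihood under q.
    return np.sum(q * (np.log(q / prior) - np.log(likelihood)))

# The posterior should achieve the minimum of the functional; compare
# it against random probability vectors drawn from a Dirichlet.
best = info_functional(posterior)
for _ in range(1000):
    q = rng.dirichlet(np.ones(3))
    assert info_functional(q) >= best - 1e-12
```

The inequality holds because G(q) rewrites as KL(q || posterior) minus the log of the normalizing constant, and the Kullback-Leibler divergence is nonnegative with minimum exactly at q = posterior.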