A Large Deviation Approach to Posterior Consistency in Dynamical Systems

In this paper, we provide asymptotic results concerning (generalized) Bayesian inference for certain dynamical systems based on a large deviation approach. Given a sequence of observations $y$, a class of model processes parameterized by $\theta \in \Theta$ which can be characterized as a stochastic process $X^\theta$ or a measure $\mu_\theta$, and a loss function $L$ which measures the error between $y$ and a realization of $X^\theta$, we specify the generalized posterior distribution $\pi_t(\theta \mid y)$. The goal of this paper is to study the asymptotic behavior of $\pi_t(\theta \mid y)$ as $t \to \infty$. In particular, we state conditions on the model family $\{X^\theta\}_{\theta \in \Theta}$ and the loss function $L$ under which the posterior distribution converges. The two conditions we require are: (1) a conditional large deviation behavior for a single $X^\theta$, and (2) an exponential continuity condition over the model family for the map from the parameter $\theta$ to the loss incurred between $X^\theta$ and the observation sequence $y$. The proposed framework is quite general; we apply it to two very different classes of dynamical systems: continuous-time hypermixing processes and Gibbs processes on shifts of finite type. We also show that the generalized posterior distribution concentrates asymptotically on those parameters that minimize the expected loss and a divergence term, hence proving posterior consistency.
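As an illustrative sketch (the precise scaling, loss, and notation are those defined in the body of the paper), a generalized posterior of this Gibbs type can be written, for a prior $\pi_0$ on $\Theta$ and a time-averaged loss $L_t(\theta, y)$ between the model at $\theta$ and the observations up to time $t$, as:

```latex
% Generalized (Gibbs) posterior after observing y up to time t:
% the prior \pi_0 is reweighted by the exponentiated, time-scaled loss.
\pi_t(d\theta \mid y)
  \;=\;
  \frac{\exp\!\bigl(-t\, L_t(\theta, y)\bigr)\, \pi_0(d\theta)}
       {\displaystyle \int_{\Theta} \exp\!\bigl(-t\, L_t(\theta', y)\bigr)\, \pi_0(d\theta')}
```

In this form, posterior consistency amounts to showing that, as $t \to \infty$, this measure concentrates near the parameters minimizing the limiting expected loss plus a divergence term, which is exactly where the large deviation and exponential continuity conditions enter.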