Qualitative Robustness of Bayesian Inference
We develop a framework for quantifying the sensitivity of the distribution of posterior distributions with respect to perturbations of the prior and of the data-generating distribution in the limit as the number of data points tends to infinity. In this generalization of Hampel's notion of qualitative robustness to Bayesian inference, posterior distributions are analyzed as measure-valued random variables (measures randomized through the data), and their robustness is quantified using the total variation, Prokhorov, and Ky Fan metrics. Our results show that (1) the assumption that the prior has Kullback-Leibler support at the parameter value generating the data, classically used to prove consistency, can also be used to prove the non-robustness of posterior distributions with respect to infinitesimal perturbations (in the total variation metric) of the class of priors satisfying that assumption; (2) for a prior with global Kullback-Leibler support on a space that is not totally bounded, we can establish the lack of qualitative robustness; and (3) consistency and robustness are, to some degree, antagonistic requirements, so a careful selection of the prior is important if both properties (or approximations thereof) are to be achieved. The mechanisms supporting our results also indicate that misspecification generates a lack of qualitative robustness.
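For reference, the three metrics named above have standard definitions; the following sketch uses generic notation (a metric space (S, d), Borel probability measures mu, nu on S, and random variables X, Y on a probability space with measure P) rather than the paper's specific setup, in which the Ky Fan metric is applied to measure-valued random variables:

```latex
% Total variation distance between probability measures mu, nu:
d_{\mathrm{TV}}(\mu,\nu) \;=\; \sup_{A \,\text{Borel}} \bigl|\mu(A)-\nu(A)\bigr|

% Prokhorov distance (metrizes weak convergence on Polish spaces),
% where A^\varepsilon = \{x : d(x,A) < \varepsilon\} is the
% \varepsilon-neighborhood of A:
d_{\mathrm{P}}(\mu,\nu) \;=\; \inf\bigl\{\varepsilon>0 :
  \mu(A) \le \nu(A^{\varepsilon}) + \varepsilon
  \ \text{for all Borel } A \bigr\}

% Ky Fan metric between random variables X, Y taking values in (S, d)
% (metrizes convergence in probability):
\alpha(X,Y) \;=\; \inf\bigl\{\varepsilon>0 :
  P\bigl(d(X,Y) > \varepsilon\bigr) \le \varepsilon \bigr\}
```

In the abstract's setting, X and Y would be posterior distributions viewed as random measures, with d a metric on the space of probability measures (e.g. the Prokhorov or total variation metric).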