Convergence Analysis of the Data Augmentation Algorithm for Bayesian
Linear Regression with Non-Gaussian Errors
Gaussian errors are sometimes inappropriate in a multivariate linear regression setting because, for example, the data contain outliers. In such situations, it is often assumed that the error density is a scale mixture of multivariate normal densities that takes the form $f(\varepsilon) = \int_0^\infty u^{d/2} |\Sigma|^{-1/2} \phi_d(u^{1/2} \Sigma^{-1/2} \varepsilon) \, h(u) \, du$, where $d$ is the dimension of the response, $\phi_d$ is the standard $d$-variate normal density, $\Sigma$ is an unknown positive definite scale matrix, and $h$ is some fixed mixing density. Combining this alternative regression model with a default prior on the unknown parameters results in a highly intractable posterior density. Fortunately, there is a simple data augmentation (DA) algorithm and a corresponding Haar PX-DA algorithm that can be used to explore this posterior. This paper provides conditions (on $h$) for geometric ergodicity of the Markov chains underlying these Markov chain Monte Carlo (MCMC) algorithms. These results are extremely important from a practical standpoint because geometric ergodicity guarantees the existence of the central limit theorems that form the basis of all the standard methods of calculating valid asymptotic standard errors for MCMC-based estimators. The main result is that, if $h$ converges to 0 at the origin at an appropriate rate, and $\int_0^\infty u^{d/2} h(u) \, du < \infty$, then the DA and Haar PX-DA Markov chains are both geometrically ergodic. This result is quite far-reaching. For example, it implies the geometric ergodicity of the DA and Haar PX-DA Markov chains whenever $h$ is generalized inverse Gaussian, log-normal, inverted gamma (with shape parameter larger than $d/2$), or Fr\'{e}chet (with shape parameter larger than $d/2$).
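To make the scale-mixture representation concrete: drawing an error from such a density amounts to drawing a mixing variable $u \sim h$ and then $\varepsilon \mid u \sim N_d(0, \Sigma/u)$. The sketch below (not the paper's code; the function name, the specific $\Sigma$, and the inverted-gamma parameters are illustrative choices) samples errors this way, using an inverted-gamma mixing density with shape parameter larger than $d/2$, one of the cases covered by the geometric-ergodicity result.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_scale_mixture_errors(n, Sigma, draw_u):
    """Draw n errors from the scale mixture of normals:
    eps | u ~ N_d(0, Sigma / u), with u ~ h (the mixing density)."""
    d = Sigma.shape[0]
    u = draw_u(n)                              # mixing draws u_1, ..., u_n ~ h
    z = rng.multivariate_normal(np.zeros(d), Sigma, size=n)
    return z / np.sqrt(u)[:, None]             # eps_i = u_i^{-1/2} * (Sigma^{1/2} z_i)

# Inverted-gamma mixing density: u = 1/g with g ~ Gamma(alpha, scale=1/beta).
# alpha = 3 > d/2 = 1 here, so the moment condition on h is satisfied.
alpha, beta = 3.0, 1.0                         # hypothetical parameter choices
draw_u = lambda n: 1.0 / rng.gamma(alpha, 1.0 / beta, size=n)

Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])                 # illustrative scale matrix (d = 2)
eps = sample_scale_mixture_errors(10_000, Sigma, draw_u)
print(eps.shape)  # (10000, 2)
```

Small mixing draws $u$ inflate the conditional covariance $\Sigma/u$, which is how the mixture produces heavier-than-Gaussian tails and accommodates outliers; the latent $u_i$ are exactly the variables that the DA algorithm augments.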