Bayesian Ultrahigh-Dimensional Screening Via MCMC

We explore the theoretical and numerical properties of a fully Bayesian model selection method in sparse ultrahigh-dimensional settings, i.e., $p \gg n$, where $p$ is the number of covariates and $n$ is the sample size. Our method consists of (1) a hierarchical Bayesian model with a novel prior placed over the model space, which includes a hyperparameter controlling the model size, and (2) an efficient MCMC algorithm for automatic, stochastic search over models. Our theory shows that, when this hyperparameter is specified correctly, the proposed method achieves selection consistency, i.e., the posterior probability of the true model asymptotically approaches one; when it is misspecified, the selected model is still asymptotically nested in the true model. The theory also reveals that the selection result is insensitive to the choice of this hyperparameter. In implementations, a reasonable prior is further assumed on the hyperparameter, which allows us to draw its samples stochastically. Our approach conducts selection, estimation, and even inference in a unified framework; no additional prescreening or dimension-reduction step is needed. Two novel g-priors are proposed to make our approach more flexible. A simulation study demonstrates the numerical advantages of our method.
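To make the stochastic model search concrete, here is a minimal sketch (not the paper's actual algorithm or priors) of Metropolis-Hastings over binary inclusion vectors for a sparse linear model with $p \gg n$. It assumes the standard closed-form Bayes factor under Zellner's g-prior for the marginal likelihood and an illustrative sparsity prior $\pi(\gamma) \propto p^{-\kappa|\gamma|}$, with `kappa` merely standing in for the model-size hyperparameter described above; the function names, `g`, and all settings are hypothetical choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_marginal(y, X, gamma, g=100.0):
    """Log Bayes factor of model `gamma` vs. the intercept-only model,
    closed form under a Zellner g-prior with fixed g."""
    n = len(y)
    k = int(gamma.sum())
    if k == 0:
        return 0.0
    yc = y - y.mean()
    Xg = X[:, gamma.astype(bool)]
    Xc = Xg - Xg.mean(axis=0)                      # centering absorbs the intercept
    beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
    rss = np.sum((yc - Xc @ beta) ** 2)
    r2 = 1.0 - rss / np.sum(yc ** 2)
    return 0.5 * (n - 1 - k) * np.log1p(g) - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2))

def log_model_prior(gamma, p, kappa=2.0):
    """Illustrative sparsity prior pi(gamma) ~ p^(-kappa*|gamma|); `kappa`
    plays the role of the model-size hyperparameter (assumption, not the paper's prior)."""
    return -kappa * gamma.sum() * np.log(p)

def mcmc_model_search(y, X, n_iter=5000, kappa=2.0, g=100.0):
    """Metropolis-Hastings over inclusion vectors: flip one randomly chosen
    coordinate (symmetric proposal), accept by the posterior ratio."""
    n, p = X.shape
    gamma = np.zeros(p)
    cur = log_marginal(y, X, gamma, g) + log_model_prior(gamma, p, kappa)
    visits = np.zeros(p)
    for _ in range(n_iter):
        j = rng.integers(p)
        prop = gamma.copy()
        prop[j] = 1 - prop[j]
        new = log_marginal(y, X, prop, g) + log_model_prior(prop, p, kappa)
        if np.log(rng.random()) < new - cur:
            gamma, cur = prop, new
        visits += gamma
    return visits / n_iter                         # posterior inclusion frequencies

# Toy example with p > n: only the first 3 covariates are active.
n, p = 80, 300
X = rng.standard_normal((n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + X[:, 2] + 0.5 * rng.standard_normal(n)
incl = mcmc_model_search(y, X)
print("top variables:", np.argsort(incl)[::-1][:5])
```

In this toy run the chain visits sparse models and the estimated inclusion frequencies concentrate on the active covariates; in the paper's fuller hierarchy the model-size hyperparameter would itself receive a prior and be sampled within the MCMC rather than fixed as `kappa` is here.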