The power-conditional-expected-posterior (PCEP) prior, developed for variable selection in normal linear regression models, combines ideas from the power prior and the expected-posterior prior, relying on the concept of random imaginary data, and yields a consistent variable selection method that leads to parsimonious inference. In this paper we discuss the computational limitations of applying the PCEP prior to generalized linear models (GLMs) and present two PCEP prior variations that are easily applicable to regression models belonging to the exponential family of distributions. We highlight the differences between the original PCEP prior and the two GLM-based PCEP adaptations and compare their properties in the conjugate case of the normal linear regression model. Hyper-prior extensions for the PCEP power parameter are also considered. We evaluate the performance of the proposed methods against other commonly used approaches in several simulation scenarios and in a real-data example. The empirical results indicate that the two GLM-PCEP adaptations lead to parsimonious variable selection inference.
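As a point of reference, the display below is a minimal sketch of the general construction as it is usually written in the power-expected-posterior literature, using generic notation (imaginary data $y^*$ of size $n^*$, power parameter $\delta$ with default $\delta = n^*$, candidate model $\gamma$ with coefficients $\beta_\gamma$, and a reference null model $0$); it is an illustrative schematic under these assumed conventions, not necessarily the exact formulation adopted in the paper.
% Schematic PCEP construction (illustrative notation; see the hedging note above):
\[
\pi^{\mathrm{PCEP}}(\beta_\gamma \mid \sigma^2, \delta)
  \;=\; \int \pi^{N}(\beta_\gamma \mid y^*, \sigma^2, \delta)\,
        m_0^{N}(y^* \mid \sigma^2, \delta)\, \mathrm{d}y^*,
\qquad
\pi^{N}(\beta_\gamma \mid y^*, \sigma^2, \delta)
  \;\propto\; f(y^* \mid \beta_\gamma, \sigma^2)^{1/\delta}\,
        \pi^{N}(\beta_\gamma \mid \sigma^2).
\]
In words, the likelihood of the random imaginary data is raised to the power $1/\delta$, and the resulting conditional posterior is averaged over the prior predictive distribution of the reference model, which is how the power-prior and expected-posterior ideas are combined.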