Asymptotic Properties of Bayes Risk of a General Class of Normal Scale Mixture Priors Under Sparsity

Abstract

Consider the problem of simultaneous testing for the means of independent normal observations. In this paper, we study asymptotic optimality properties of certain multiple testing rules in a Bayesian decision theoretic framework, where the overall loss of a multiple testing rule is taken as the number of misclassified hypotheses. We assume a two-groups normal mixture model for the data. We consider the same asymptotic framework adopted in Bogdan, Chakrabarti, Frommlet and Ghosh (2011) in this two-groups formulation and compare the risk of the multiple testing rules under study with that of the Bayes Oracle introduced in that paper. The multiple testing rules we consider are induced by a general class of global-local shrinkage priors for the mean parameter. This class of shrinkage priors is rich enough to include, among others, the horseshoe, the Hypergeometric Inverted-Beta, the generalized Double Pareto, the Three Parameter Beta and the Inverse-Gamma priors. We establish that, within our chosen asymptotic framework, the multiple testing rules under study asymptotically attain the risk of the Bayes Oracle up to a factor of O(1), with the constant in the risk close to the constant in the Bayes Oracle. This is similar to a result obtained in Datta and Ghosh (2013) for the multiple testing rule based on the horseshoe estimator introduced in Carvalho, Polson and Scott (2009, 2010). We have a unifying argument applicable to the general class of priors under study. In the process, we settle a conjecture regarding the optimality property of the generalized Double Pareto priors made in Datta and Ghosh (2013). Our work also shows that the result in Datta and Ghosh (2013) can be improved.