Asymptotic Properties of Bayes Risk of a General Class of Shrinkage Priors in Multiple Hypothesis Testing Under Sparsity

Consider the problem of simultaneous testing for the means of independent normal observations. In this paper, we study asymptotic optimality properties of certain multiple testing rules in a Bayesian decision theoretic framework, where the overall loss of a multiple testing rule is taken as the number of misclassified hypotheses. We assume a two-groups normal mixture model for the data and consider the asymptotic framework adopted in \citet{BCFG2011}. We compare the risk of the multiple testing rules under study with that of the Bayes Oracle considered in that paper. The multiple testing rules we study are induced by a general class of one-group shrinkage priors for the mean parameter. This class of shrinkage priors is rich enough to include, among others, the families of Three Parameter Beta and generalized double Pareto priors, and in particular the horseshoe, the normal-exponential-gamma and the Strawderman-Berger priors. We establish that within our chosen asymptotic framework, the multiple testing rules under study asymptotically attain the risk of the Bayes Oracle up to a multiplicative factor, with the constant in the risk close to the corresponding constant in the Oracle risk. This is similar to a result obtained in \citet{DG2013} for the multiple testing rule based on the horseshoe estimator introduced in \citet{CPS2009, CPS2010}. We give a unifying argument applicable to the general class of priors under study. In the process, we settle a conjecture made in \citet{DG2013} regarding the optimality property of the generalized double Pareto priors. Our work also shows that the result in \citet{DG2013} can be improved.
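For concreteness, the following is a minimal sketch (not taken from the paper) of the setup described above, using the horseshoe prior as one member of the class under study: data are simulated from a two-groups normal mixture, the posterior shrinkage weight for each observation is computed by one-dimensional quadrature, and a hypothesis is flagged as a signal when that weight exceeds 1/2, which is the standard decision rule for such shrinkage priors (as in \citet{CPS2010, DG2013}). The values of `n`, `p`, `psi2` and `tau` are purely illustrative choices, not the paper's.

```python
import numpy as np
from scipy import integrate, stats

rng = np.random.default_rng(0)

# --- Two-groups normal mixture data (sparse means); illustrative parameter values ---
n, p, psi2 = 200, 0.05, 25.0            # number of tests, sparsity level, signal variance
is_signal = rng.random(n) < p
theta = np.where(is_signal, rng.normal(0.0, np.sqrt(psi2), n), 0.0)
x = theta + rng.normal(0.0, 1.0, n)      # X_i ~ N(theta_i, 1)

def posterior_shrinkage(xi, tau):
    """E[1 - kappa_i | x_i] under a horseshoe prior, where kappa_i = 1/(1 + tau^2 lambda_i^2)
    and lambda_i ~ C+(0, 1); computed by numerical quadrature over lambda_i."""
    def integrand(lam, weight_fn):
        var = 1.0 + tau**2 * lam**2                      # marginal variance of x_i given lambda_i
        return (weight_fn(var)
                * stats.norm.pdf(xi, scale=np.sqrt(var)) # likelihood of x_i given lambda_i
                * stats.halfcauchy.pdf(lam))             # half-Cauchy prior on lambda_i
    num, _ = integrate.quad(integrand, 0, np.inf, args=(lambda v: 1.0 - 1.0 / v,))
    den, _ = integrate.quad(integrand, 0, np.inf, args=(lambda v: 1.0,))
    return num / den

tau = 0.05                                # illustrative small value of the global parameter
weights = np.array([posterior_shrinkage(xi, tau) for xi in x])

# Induced testing rule: declare a signal when the posterior shrinkage weight exceeds 1/2.
reject = weights > 0.5
# Loss as in the abstract: the number of misclassified hypotheses.
misclassified = np.sum(reject != is_signal)
print(f"misclassified hypotheses: {misclassified} out of {n}")
```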