Asymptotic Theory of a Bayesian Non-Marginal Multiple Testing Procedure Under Possible Model Misspecification

The effect of dependence among multiple hypotheses has recently attracted the attention of the statistical community, and researchers have recognized the importance of exploiting this dependence in multiple testing procedures and in the associated error measures. In this regard, we have devised a novel Bayesian multiple testing procedure in which the inherent dependence structure among the hypotheses is exploited to enhance inference. Since the decisions are obtained jointly, as a function of the relevant joint posterior probabilities, we refer to this new method as a Bayesian non-marginal multiple testing procedure. In this article, we investigate the asymptotic properties of the non-marginal procedure. In particular, we show that the asymptotic convergence rates of the multiple testing error measures, namely the FDR and FNR, are directly associated with the Kullback-Leibler divergence from the true model. The asymptotic results in this article hold quite generally, covering not only dependent set-ups but also the case where the proposed model is misspecified. We illustrate our asymptotic theory on the multiple hypothesis testing problem of time-varying covariate selection under an autoregressive model set-up.
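For readers unfamiliar with the error measures the abstract refers to, the following minimal sketch computes the empirical false discovery rate (FDR) and false non-discovery rate (FNR) for a vector of testing decisions against a hypothetical ground truth. The data and function name are illustrative only; the paper's non-marginal procedure derives its decisions jointly from posterior probabilities, which this toy example does not attempt to model.

```python
def fdr_fnr(truth, decisions):
    """Empirical FDR and FNR for a batch of hypothesis tests.

    truth[i]     = 1 if the alternative is true for hypothesis i, else 0.
    decisions[i] = 1 if hypothesis i is rejected (a 'discovery'), else 0.
    """
    discoveries = sum(decisions)
    non_discoveries = len(decisions) - discoveries
    # False discoveries: rejected hypotheses whose null is actually true.
    false_disc = sum(d and not t for t, d in zip(truth, decisions))
    # False non-discoveries: accepted hypotheses whose alternative is true.
    false_non = sum(t and not d for t, d in zip(truth, decisions))
    # Guard against division by zero when there are no (non-)discoveries.
    fdr = false_disc / max(discoveries, 1)
    fnr = false_non / max(non_discoveries, 1)
    return fdr, fnr

# Hypothetical example: 6 hypotheses, one false discovery and one
# false non-discovery, so both error rates equal 1/3.
truth     = [1, 1, 0, 0, 1, 0]
decisions = [1, 1, 1, 0, 0, 0]
print(fdr_fnr(truth, decisions))
```

The paper's asymptotic results concern how these quantities decay as the sample size grows, with rates governed by the Kullback-Leibler divergence from the true model.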