The Sample Complexity of Simple Binary Hypothesis Testing

The sample complexity of simple binary hypothesis testing is the smallest number of i.i.d.\ samples required to distinguish between two distributions $p$ and $q$ in either: (i) the prior-free setting, with type-I error at most $\alpha$ and type-II error at most $\beta$; or (ii) the Bayesian setting, with Bayes error at most $\delta$ and prior distribution $(\alpha, 1-\alpha)$. This problem has only been studied when $\alpha = \beta$ (prior-free) or $\alpha = 1/2$ (Bayesian), and the sample complexity is known to be characterized by the Hellinger divergence between $p$ and $q$, up to multiplicative constants. In this paper, we derive a formula that characterizes the sample complexity (up to multiplicative constants that are independent of $p$, $q$, and all error parameters) for: (i) all $0 \le \alpha, \beta \le 1/8$ in the prior-free setting; and (ii) all $\delta \le \alpha/4$ in the Bayesian setting. In particular, the formula admits equivalent expressions in terms of certain divergences from the Jensen--Shannon and Hellinger families. The main technical result concerns an $f$-divergence inequality between members of the Jensen--Shannon and Hellinger families, which is proved by a combination of information-theoretic tools and case-by-case analyses. We explore applications of our results to (i) robust hypothesis testing, (ii) distributed (locally-private and communication-constrained) hypothesis testing, (iii) sequential hypothesis testing, and (iv) hypothesis testing with erasures.
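For context, the previously known symmetric-case characterization referenced above can be sketched as follows; the notation $n^*$ and the normalization of $H^2$ below are illustrative conventions and are not taken from the paper. For discrete distributions $p$ and $q$, the squared Hellinger distance is
\[
  H^2(p, q) \;=\; \frac{1}{2} \sum_{x} \bigl( \sqrt{p(x)} - \sqrt{q(x)} \bigr)^2 ,
\]
and in the symmetric regime $\alpha = \beta = \delta$ (with $\delta$ bounded away from $1/2$), the number of i.i.d.\ samples needed scales as
\[
  n^*(p, q, \delta) \;\asymp\; \frac{\log(1/\delta)}{H^2(p, q)} ,
\]
up to universal multiplicative constants. The present paper extends this type of characterization to asymmetric error probabilities and unequal priors.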
@article{pensia2025_2403.16981,
  title   = {The Sample Complexity of Simple Binary Hypothesis Testing},
  author  = {Ankit Pensia and Varun Jog and Po-Ling Loh},
  journal = {arXiv preprint arXiv:2403.16981},
  year    = {2025}
}