Tight Bounds on the Binomial CDF, and the Minimum of i.i.d Binomials, in terms of KL-Divergence

Abstract
We provide finite-sample upper and lower bounds on the Binomial tail probability that follow as a direct application of Sanov's theorem. We then use these to obtain high-probability upper and lower bounds on the minimum of i.i.d. Binomial random variables. Both bounds are finite-sample, asymptotically tight, and expressed in terms of the KL-divergence.
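For intuition on the kind of bound involved, the classical Chernoff bound states that for a Binomial(n, p) variable X and k/n < p, P(X ≤ k) ≤ exp(−n·KL(k/n ‖ p)), where KL denotes the Bernoulli KL-divergence. This is a standard textbook bound, not necessarily the paper's exact (tighter, two-sided) statement; a minimal numerical sketch checking it:

```python
import math

def kl_bernoulli(q: float, p: float) -> float:
    """KL divergence KL(q || p) between Bernoulli(q) and Bernoulli(p)."""
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

def binom_cdf(n: int, p: float, k: int) -> float:
    """Exact P(Bin(n, p) <= k), summed directly."""
    return sum(math.comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))

# Example values (chosen for illustration): n = 100, p = 0.5, threshold k = 35,
# so k/n = 0.35 < p and the Chernoff upper bound applies.
n, p, k = 100, 0.5, 35
exact = binom_cdf(n, p, k)
bound = math.exp(-n * kl_bernoulli(k / n, p))

print(f"exact tail P(X <= {k}) = {exact:.6f}")
print(f"Chernoff/KL bound      = {bound:.6f}")
```

The exponent n·KL(k/n ‖ p) is exactly the large-deviations rate predicted by Sanov's theorem, which is why the exact tail and the bound agree to first order in the exponent as n grows; the paper's contribution is to make this matching tight at finite n.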
Citation:

@article{zhu2025_2502.18611,
  title={Tight Bounds on the Binomial CDF, and the Minimum of i.i.d Binomials, in terms of KL-Divergence},
  author={Xiaohan Zhu and Mesrob I. Ohannessian and Nathan Srebro},
  journal={arXiv preprint arXiv:2502.18611},
  year={2025}
}