In this paper, we derive a useful lower bound for the Kullback-Leibler divergence (KL-divergence) based on the Hammersley-Chapman-Robbins bound (HCRB). The HCRB states that the variance of an estimator is bounded from below in terms of the Chi-square divergence and the difference of the expectation values of the estimator under the two distributions. By using the relation between the KL-divergence and the Chi-square divergence, we obtain a lower bound for the KL-divergence which depends only on the expectation value and the variance of a function we choose. We show that equality holds for Bernoulli distributions and that the inequality converges to the Cram\'{e}r-Rao bound when the two distributions are very close. Furthermore, we describe several applications and numerical examples.
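For reference, a standard statement of the HCRB that serves as the starting point is sketched below; the parametrized distributions $P_{\theta}$, $P_{\theta'}$ and the estimator $T$ are notation introduced here for illustration and are not taken from the abstract itself.
\[
\operatorname{Var}_{\theta}(T) \;\geq\; \sup_{\theta' \neq \theta}
\frac{\bigl(\mathbb{E}_{\theta'}[T]-\mathbb{E}_{\theta}[T]\bigr)^{2}}
     {\chi^{2}\!\bigl(P_{\theta'}\,\|\,P_{\theta}\bigr)},
\qquad
\chi^{2}(P\,\|\,Q) \;=\; \int \frac{(\mathrm{d}P-\mathrm{d}Q)^{2}}{\mathrm{d}Q}.
\]
Rearranged, this gives a lower bound on the Chi-square divergence in terms of the difference of expectation values and the variance of $T$; combining such a bound with a relation between the KL-divergence and the Chi-square divergence is the route to the lower bound for the KL-divergence described above.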