A New Lower Bound for Kullback-Leibler Divergence Based on Hammersley-Chapman-Robbins Bound

29 June 2019
T. Nishiyama
arXiv:1907.00288
Abstract

In this paper, we derive a useful lower bound for the Kullback-Leibler divergence (KL-divergence) based on the Hammersley-Chapman-Robbins bound (HCRB). The HCRB bounds the variance of an estimator from below in terms of the Chi-square divergence and the expectation values of the estimator. Using the relation between the KL-divergence and the Chi-square divergence, we derive a lower bound for the KL-divergence that depends only on the expectation value and the variance of a function we choose. We show that equality holds for Bernoulli distributions, and that the inequality converges to the Cramér-Rao bound when the two distributions are sufficiently close. Furthermore, we describe application examples and numerical calculations.
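As a quick illustration of the HCRB mentioned in the abstract, the sketch below numerically checks the standard form of the bound, Var_P[f] >= (E_Q[f] - E_P[f])^2 / chi2(Q||P), for two Bernoulli distributions and the identity statistic. The specific distributions, the choice of f, and the helper functions are illustrative assumptions for this sketch; they are not the paper's notation and do not implement the paper's KL lower bound itself.

```python
# Minimal numerical sketch of the Hammersley-Chapman-Robbins bound (HCRB):
#     Var_P[f] >= (E_Q[f] - E_P[f])^2 / chi2(Q || P)
# The Bernoulli parameters and the statistic f are illustrative choices,
# not taken from the paper.

import numpy as np

def chi_square_divergence(q, p):
    """Chi-square divergence chi2(Q || P) = sum_x (q(x) - p(x))^2 / p(x)."""
    return np.sum((q - p) ** 2 / p)

def kl_divergence(p, q):
    """KL-divergence D(P || Q) = sum_x p(x) log(p(x) / q(x))."""
    return np.sum(p * np.log(p / q))

# Two Bernoulli distributions written as probability vectors over {0, 1}.
p_param, q_param = 0.3, 0.45
P = np.array([1.0 - p_param, p_param])
Q = np.array([1.0 - q_param, q_param])

# Statistic f: the identity on {0, 1} (any function of the outcome works).
f = np.array([0.0, 1.0])

mean_P, mean_Q = np.dot(P, f), np.dot(Q, f)
var_P = np.dot(P, (f - mean_P) ** 2)

hcrb_lower_bound = (mean_Q - mean_P) ** 2 / chi_square_divergence(Q, P)

print(f"Var_P[f]         = {var_P:.6f}")
print(f"HCRB lower bound = {hcrb_lower_bound:.6f}")
# Tolerance guards against floating-point round-off; for Bernoulli with the
# identity statistic the bound is in fact tight (equality).
print(f"HCRB holds       : {var_P + 1e-12 >= hcrb_lower_bound}")
print(f"D(P || Q)        = {kl_divergence(P, Q):.6f}")
```

For this Bernoulli example the printed variance and the HCRB coincide (both equal p(1-p) = 0.21), which is the tight case; the KL value is printed only for context, since the paper's own lower bound on the KL-divergence is not reproduced here.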
