ResearchTrend.AI


Information Geometry and Classical Cramér-Rao Type Inequalities

Handbook of Statistics (HS), 2021
2 April 2021
Kumar Vijay Mishra
M. I. M. Ashok Kumar
Abstract

We examine the role of information geometry in the context of classical Cramér-Rao (CR) type inequalities. In particular, we focus on Eguchi's theory of obtaining dualistic geometric structures from a divergence function and then applying Amari-Nagaoka's theory to obtain a CR type inequality. The classical deterministic CR inequality is derived from the Kullback-Leibler (KL) divergence. We show that this framework can be generalized to other CR type inequalities through four examples: the α-version of the CR inequality, the generalized CR inequality, the Bayesian CR inequality, and the Bayesian α-CR inequality. These are obtained, respectively, from the I_α-divergence (or relative α-entropy), the generalized Csiszár divergence, the Bayesian KL divergence, and the Bayesian I_α-divergence.
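As a concrete illustration of the classical case the abstract starts from (not code from the paper itself): for a Gaussian location model N(θ, 1), the Fisher information is the curvature of the KL divergence, KL(N(θ,1) ‖ N(θ+ε,1)) = ε²/2, giving I(θ) = 1 per sample. The CR bound then says any unbiased estimator of θ from n samples has variance at least 1/(n·I(θ)), and the sample mean attains it. A minimal Monte Carlo sketch, with all parameter values chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: estimate the mean theta of N(theta, 1) from n samples.
theta, n, trials = 2.0, 100, 20000

# Fisher information as KL curvature: KL(N(t,1) || N(t+eps,1)) = eps^2 / 2,
# so I(theta) = 2 * KL / eps^2 = 1 for this model.
eps = 1e-3
kl = eps**2 / 2.0                 # closed-form KL between the two Gaussians
fisher = 2.0 * kl / eps**2        # recovers I(theta) = 1
crb = 1.0 / (n * fisher)          # Cramér-Rao lower bound on the variance

# The sample mean is unbiased and efficient here: its variance matches the bound.
estimates = rng.normal(theta, 1.0, size=(trials, n)).mean(axis=1)
print(fisher, crb, estimates.var())
```

With n = 100 the bound is 0.01, and the empirical variance of the sample mean across trials concentrates around that value; the α- and Bayesian variants discussed in the abstract replace the KL divergence in this construction with the corresponding divergence.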
