Minimization Problems Based on Relative α-Entropy I: Forward Projection

9 October 2014 · arXiv:1410.2346

Ashok Kumar Moses, R. Sundaresan
Abstract

Minimization problems with respect to a one-parameter family of generalized relative entropies are studied. These relative entropies, which we term relative α-entropies (denoted $\mathscr{I}_{\alpha}$), arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual relative entropy (Kullback-Leibler divergence). Just like relative entropy, these relative α-entropies behave like squared Euclidean distance and satisfy the Pythagorean property. Minimizers of these relative α-entropies on closed and convex sets are shown to exist. Such minimizations generalize the maximum Rényi or Tsallis entropy principle. The minimizing probability distribution (termed the forward $\mathscr{I}_{\alpha}$-projection) for a linear family is shown to obey a power-law. Other results in connection with statistical inference, namely subspace transitivity and iterated projections, are also established. In a companion paper, a related minimization problem of interest in robust statistics that leads to a reverse $\mathscr{I}_{\alpha}$-projection is studied.
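The abstract does not reproduce the definition of $\mathscr{I}_{\alpha}$ itself. As a reference sketch, the following closed form is a reconstruction from the authors' line of work on mismatched guessing and compression (not quoted from this page); it reduces to the Kullback-Leibler divergence as $\alpha \to 1$:

```latex
% Relative alpha-entropy of P with respect to Q on a finite alphabet
% (alpha > 0, alpha != 1); reconstruction from the related literature.
\[
  \mathscr{I}_{\alpha}(P,Q)
  = \frac{\alpha}{1-\alpha}\log\Big(\sum_{x} P(x)\,Q(x)^{\alpha-1}\Big)
    - \frac{1}{1-\alpha}\log\Big(\sum_{x} P(x)^{\alpha}\Big)
    + \log\Big(\sum_{x} Q(x)^{\alpha}\Big).
\]
% Equivalently, with escort distributions
% P^{(alpha)}(x) = P(x)^alpha / sum_y P(y)^alpha,
% it is a Renyi divergence of order 1/alpha:
\[
  \mathscr{I}_{\alpha}(P,Q)
  = D_{1/\alpha}\big(P^{(\alpha)} \,\big\|\, Q^{(\alpha)}\big),
  \qquad
  \lim_{\alpha \to 1} \mathscr{I}_{\alpha}(P,Q) = D(P \,\|\, Q).
\]
```

The second identity makes the connection to Rényi divergences explicit.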

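A minimal numerical sketch of the closed form above (illustrative code, not from the paper; the helper names are ours), confirming the $\alpha \to 1$ limit against the Kullback-Leibler divergence:

```python
import numpy as np

def relative_alpha_entropy(p, q, alpha):
    """Relative alpha-entropy I_alpha(p, q) on a finite alphabet.

    Implements the reconstructed closed form
        a/(1-a) * log sum p*q^(a-1)
        - 1/(1-a) * log sum p^a
        + log sum q^a
    for alpha > 0, alpha != 1.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    a = alpha
    return (a / (1.0 - a) * np.log(np.sum(p * q ** (a - 1.0)))
            - 1.0 / (1.0 - a) * np.log(np.sum(p ** a))
            + np.log(np.sum(q ** a)))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), the alpha -> 1 limit."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(relative_alpha_entropy(p, q, alpha=0.999999))  # ~0.02527
print(kl_divergence(p, q))                           # ~0.02527
```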