arXiv:1510.01844
Linear Bounds between Contraction Coefficients for f-Divergences

7 October 2015
A. Makur
Lizhong Zheng
Abstract

Data processing inequalities for f-divergences can be sharpened using constants called "contraction coefficients" to produce strong data processing inequalities. For any discrete source-channel pair, the contraction coefficients for f-divergences are lower bounded by the contraction coefficient for χ²-divergence. In this paper, we elucidate that this lower bound can be achieved by driving the input f-divergences of the contraction coefficients to zero. Then, we establish a linear upper bound on the contraction coefficients for a certain class of f-divergences using the contraction coefficient for χ²-divergence, and refine this upper bound for the salient special case of Kullback-Leibler (KL) divergence. Furthermore, we present an alternative proof of the fact that the contraction coefficients for KL and χ²-divergences are equal for a Gaussian source with an additive Gaussian noise channel (where the former coefficient can be power constrained). Finally, we generalize the well-known result that contraction coefficients of channels (after extremizing over all possible sources) for all f-divergences with non-linear operator convex f are equal. In particular, we prove that the so-called "less noisy" preorder over channels can be equivalently characterized by any non-linear operator convex f-divergence.
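For orientation, the objects the abstract manipulates can be written out explicitly. The block below is a sketch of the standard definitions, in notation assumed here (source distribution P_X, channel W, perturbed input Q_X); it paraphrases textbook material rather than quoting the paper.

```latex
% Contraction coefficient of a source-channel pair (P_X, W) for an
% f-divergence D_f (standard definition, notation assumed here):
\eta_f(P_X, W) \triangleq
  \sup_{Q_X \,:\, 0 < D_f(Q_X \| P_X) < \infty}
  \frac{D_f(Q_X W \,\|\, P_X W)}{D_f(Q_X \,\|\, P_X)}

% The resulting strong data processing inequality:
D_f(Q_X W \,\|\, P_X W) \;\le\; \eta_f(P_X, W)\, D_f(Q_X \,\|\, P_X)

% Lower bound discussed in the abstract, achieved in the limit
% D_f(Q_X \| P_X) -> 0:
\eta_{\chi^2}(P_X, W) \;\le\; \eta_f(P_X, W)

% Channel-only coefficient ("after extremizing over all possible sources"):
\eta_f(W) \triangleq \sup_{P_X} \eta_f(P_X, W)
```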

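The final claim concerns the "less noisy" preorder over channels. The following is a sketch of the usual KL-based characterization and of the generalization the abstract announces; the preorder symbol ⪯_ln is notation assumed here, not taken from the paper.

```latex
% Classical characterization: channel W is less noisy than channel V,
% written V \preceq_{\mathrm{ln}} W, iff for all input pairs (P_X, Q_X):
D(P_X W \,\|\, Q_X W) \;\ge\; D(P_X V \,\|\, Q_X V)

% The abstract's last result: the KL divergence above may be replaced
% by D_f for any non-linear operator convex f, e.g. the chi^2-divergence
% with f(t) = (t - 1)^2:
V \preceq_{\mathrm{ln}} W
\iff
\chi^2(P_X W \,\|\, Q_X W) \;\ge\; \chi^2(P_X V \,\|\, Q_X V)
\quad \text{for all } P_X, Q_X
```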
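In the discrete case, η_χ² is also directly computable, which makes the lower bound easy to experiment with. The NumPy sketch below uses the classical characterization of η_χ²(P_X, W) as the squared second-largest singular value of the divergence transition matrix; the function name and the BSC example are illustrative choices, not from the paper.

```python
import numpy as np

def chi2_contraction(p_x, W):
    """Contraction coefficient for chi^2-divergence of a discrete
    source-channel pair (p_x, W), where W[x, y] = P(Y = y | X = x).

    Uses the classical characterization: eta_{chi^2} equals the squared
    second-largest singular value of the divergence transition matrix
        B = diag(p_x)^{1/2} @ W @ diag(p_y)^{-1/2},
    whose top singular value is always 1.
    """
    p_x = np.asarray(p_x, dtype=float)
    W = np.asarray(W, dtype=float)
    p_y = p_x @ W                                  # output distribution P_Y = P_X W
    B = np.sqrt(p_x)[:, None] * W / np.sqrt(p_y)[None, :]
    s = np.linalg.svd(B, compute_uv=False)         # singular values, descending
    return s[1] ** 2                               # sigma_1 = 1; eta = sigma_2^2

# Example: binary symmetric channel BSC(delta) with uniform input,
# where the known closed form is eta_{chi^2} = (1 - 2*delta)^2.
delta = 0.1
W = np.array([[1 - delta, delta],
              [delta, 1 - delta]])
print(chi2_contraction([0.5, 0.5], W))             # ~ (1 - 0.2)**2 = 0.64
```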