Linear Bounds between Contraction Coefficients for $f$-Divergences

Abstract

Data processing inequalities for $f$-divergences can be sharpened using constants called "contraction coefficients" to produce strong data processing inequalities. For any discrete source-channel pair, the contraction coefficients for $f$-divergences are lower bounded by the contraction coefficient for $\chi^2$-divergence. In this paper, we elucidate that this lower bound can be achieved by driving the input $f$-divergences of the contraction coefficients to zero. Then, we establish a linear upper bound on the contraction coefficients for a certain class of $f$-divergences using the contraction coefficient for $\chi^2$-divergence, and refine this upper bound for the salient special case of Kullback-Leibler (KL) divergence. Furthermore, we present an alternative proof of the fact that the contraction coefficients for KL and $\chi^2$-divergences are equal for a Gaussian source with an additive Gaussian noise channel (where the former coefficient can be power constrained). Finally, we generalize the well-known result that the contraction coefficients of channels (after extremizing over all possible sources) are equal for all $f$-divergences with non-linear operator convex $f$. In particular, we prove that the so-called "less noisy" preorder over channels can be equivalently characterized by any non-linear operator convex $f$-divergence.
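
For orientation, the quantities referenced above can be written out using the standard definitions of contraction coefficients; the notation here is supplied for context and is not quoted from the paper. For a source $P_X$ and channel $P_{Y|X}$, with $Q_Y$ and $P_Y$ denoting the channel outputs under inputs $Q_X$ and $P_X$,
\[
\eta_f(P_X, P_{Y|X}) \;=\; \sup_{Q_X \,:\, 0 < D_f(Q_X \| P_X) < \infty} \frac{D_f(Q_Y \| P_Y)}{D_f(Q_X \| P_X)},
\]
so that the strong data processing inequality $D_f(Q_Y \| P_Y) \le \eta_f(P_X, P_{Y|X}) \, D_f(Q_X \| P_X)$ holds, and the lower bound mentioned in the abstract reads
\[
\eta_{\chi^2}(P_X, P_{Y|X}) \;\le\; \eta_f(P_X, P_{Y|X}) \;\le\; 1.
\]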

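As a concrete illustration of the coefficient appearing on the left of these bounds, the sketch below numerically computes $\eta_{\chi^2}(P_X, P_{Y|X})$ for a discrete source-channel pair via its classical characterization as the squared second-largest singular value of the divergence transition matrix (equivalently, the squared maximal correlation). The function name and interface are illustrative, not taken from the paper.

import numpy as np

def chi_squared_contraction(p_x, W):
    """Chi^2 contraction coefficient of a discrete source-channel pair.

    p_x : length-n array of strictly positive source probabilities P_X.
    W   : n-by-m array with W[x, y] = P_{Y|X}(y | x); rows sum to 1.

    Uses the classical fact that eta_{chi^2}(P_X, P_{Y|X}) equals the
    squared second-largest singular value of the matrix
    B(x, y) = P_{X,Y}(x, y) / sqrt(P_X(x) * P_Y(y)),
    whose largest singular value is always 1.
    """
    p_x = np.asarray(p_x, dtype=float)
    W = np.asarray(W, dtype=float)
    p_xy = p_x[:, None] * W                  # joint distribution P_{X,Y}
    p_y = p_xy.sum(axis=0)                   # output marginal P_Y
    B = p_xy / np.sqrt(np.outer(p_x, p_y))   # "divergence transition" matrix
    s = np.linalg.svd(B, compute_uv=False)   # singular values, descending
    return s[1] ** 2

# Binary symmetric channel with crossover 0.1 and a uniform source:
# the coefficient should be (1 - 2 * 0.1)^2 = 0.64.
print(chi_squared_contraction([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]]))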