
f-Divergences and Their Applications in Lossy Compression and Bounding Generalization Error

IEEE Transactions on Information Theory (IEEE Trans. Inf. Theory), 2022
Abstract

In this paper, we provide three applications for f-divergences: (i) we generalize Sanov's upper bound on the tail probability of the sum of independent random variables using super-modular f-divergences, and show that the generalized Sanov bound strictly improves over the ordinary one; (ii) we consider the lossy compression problem, which studies the set of achievable rates for a given distortion and code length. We extend the rate-distortion function using mutual f-information and, using super-modular f-divergences, provide new and strictly better bounds on achievable rates in the finite blocklength regime; and (iii) we provide a connection between the generalization error of algorithms with bounded input/output mutual f-information and a generalized rate-distortion problem. This connection allows us to bound the generalization error of learning algorithms using lower bounds on the f-rate-distortion function. Our bound is based on a new lower bound on the rate-distortion function that (for some examples) strictly improves over the previously best-known bounds.
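As a quick illustration of the central object (not code from the paper), an f-divergence between discrete distributions P and Q is defined as D_f(P‖Q) = Σ_x q(x) f(p(x)/q(x)) for a convex function f with f(1) = 0. The sketch below, with hypothetical helper names, recovers the KL divergence and total variation distance as special cases of f:

```python
import math

def f_divergence(p, q, f):
    # D_f(P || Q) = sum_x q(x) * f(p(x) / q(x)),
    # assuming q(x) > 0 wherever p(x) > 0 (absolute continuity).
    return sum(qx * f(px / qx) for px, qx in zip(p, q) if qx > 0)

# f(t) = t log t  recovers the KL divergence D(P || Q)
kl_f = lambda t: t * math.log(t) if t > 0 else 0.0

# f(t) = |t - 1| / 2  recovers the total variation distance
tv_f = lambda t: 0.5 * abs(t - 1)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(f_divergence(p, q, kl_f))  # D(P || Q) in nats
print(f_divergence(p, q, tv_f))  # → 0.4, i.e. (1/2) * sum |p - q|
```

Different convex choices of f (squared Hellinger, chi-squared, etc.) yield the other divergences in the family; the paper's super-modular f-divergences are a subclass with additional structure that enables the improved bounds.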
