f-divergences and their applications in lossy compression and bounding generalization error

21 June 2022
Saeed Masiha
A. Gohari
Mohammad Hossein Yassaee
Abstract

In this paper, we provide three applications for f-divergences: (i) we introduce Sanov's upper bound on the tail probability of the sum of independent random variables based on super-modular f-divergence and show that our generalized Sanov's bound strictly improves over the ordinary one; (ii) we consider the lossy compression problem, which studies the set of achievable rates for a given distortion and code length. We extend the rate-distortion function using mutual f-information and provide new and strictly better bounds on achievable rates in the finite blocklength regime using super-modular f-divergences; and (iii) we provide a connection between the generalization error of algorithms with bounded input/output mutual f-information and a generalized rate-distortion problem. This connection allows us to bound the generalization error of learning algorithms using lower bounds on the f-rate-distortion function. Our bound is based on a new lower bound on the rate-distortion function that (for some examples) strictly improves over the previously best-known bounds.
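The abstract builds on the standard notion of an f-divergence, D_f(P||Q) = sum_x Q(x) f(P(x)/Q(x)) for a convex generator f with f(1) = 0, and on the ordinary Sanov/Chernoff tail bound that the super-modular version is claimed to strictly improve. The minimal Python sketch below illustrates both under standard conventions; the function names, the choice of generators, and the Bernoulli example are illustrative assumptions, not the paper's own code or notation, and the paper's super-modular bound itself is not reproduced here.

```python
import numpy as np

# An f-divergence D_f(P||Q) = sum_x Q(x) * f(P(x)/Q(x)) for a convex
# generator f with f(1) = 0.  A few standard generators (illustrative
# names, not taken from the paper):
GENERATORS = {
    "kl":          lambda t: t * np.log(t),        # Kullback-Leibler
    "total_var":   lambda t: 0.5 * np.abs(t - 1),  # total variation
    "chi_squared": lambda t: (t - 1) ** 2,         # chi-squared
}

def f_divergence(p, q, f):
    """D_f(P||Q) for discrete distributions; assumes Q has full support
    and P is positive wherever the generator needs t * log(t)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

def bernoulli_kl(a, p):
    """KL divergence between Bernoulli(a) and Bernoulli(p)."""
    return a * np.log(a / p) + (1 - a) * np.log((1 - a) / (1 - p))

def ordinary_tail_bound(n, a, p):
    """A standard Chernoff/Sanov-style instance of the 'ordinary' bound:
    for X_1..X_n i.i.d. Bernoulli(p) and a > p,
    P((1/n) * sum X_i >= a) <= exp(-n * KL(Bern(a) || Bern(p)))."""
    return float(np.exp(-n * bernoulli_kl(a, p)))

if __name__ == "__main__":
    P, Q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
    for name, f in GENERATORS.items():
        print(f"D_{name}(P||Q) = {f_divergence(P, Q, f):.4f}")
    # Tail probability that the mean of 100 fair coin flips exceeds 0.7:
    print("ordinary bound:", ordinary_tail_bound(100, 0.7, 0.5))
```

The Bernoulli example is the baseline the abstract refers to as the "ordinary" Sanov bound; the paper's contribution (i) replaces the KL exponent with a super-modular f-divergence to obtain a strictly tighter bound.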
