An Information-Theoretic Regularizer for Lossy Neural Image Compression

23 November 2024
Yingwen Zhang
Meng Wang
Xihua Sheng
Peilin Chen
Junru Li
Li Zhang
Shiqi Wang
Abstract

Lossy image compression networks aim to minimize the latent entropy of images while adhering to specific distortion constraints. However, optimizing such a network is challenging because it must learn quantized latent representations. In this paper, our key finding is that minimizing the latent entropy is, to some extent, equivalent to maximizing the conditional source entropy, an insight that is deeply rooted in information-theoretic equalities. Building on this insight, we propose a novel structural regularization method for the neural image compression task that incorporates the negative conditional source entropy into the training objective, improving both optimization efficacy and the model's generalization ability. The proposed information-theoretic regularizer is interpretable, plug-and-play, and imposes no inference overhead. Extensive experiments demonstrate its effectiveness in regularizing models and squeezing additional bits from the latent representation across various compression architectures and unseen domains.
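The claimed equivalence follows from a standard information-theoretic identity. Below is a minimal sketch, assuming the quantized latent Y is a deterministic function of the source X (so H(Y|X) = 0); the paper's exact derivation may differ.

```latex
% Two expansions of the mutual information between source X and latent Y:
%   I(X;Y) = H(Y) - H(Y|X) = H(X) - H(X|Y)
% If Y = round(f(X)) is a deterministic function of X, then H(Y|X) = 0:
\begin{align*}
  H(Y) &= H(X) - H(X \mid Y)
\end{align*}
% The source entropy H(X) is fixed by the data, so minimizing the latent
% entropy H(Y) is equivalent to maximizing the conditional source
% entropy H(X|Y).
```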

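In practice, the regularized objective described in the abstract could take a form like the PyTorch-style sketch below. The model interface (encode_and_quantize, decode, conditional_nll) and the weights lam and gamma are hypothetical assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def regularized_rd_loss(model, x, lam=0.01, gamma=0.1):
    """Rate-distortion loss with the negative conditional source
    entropy H(X|Y) added as a regularizer (illustrative sketch)."""
    y_hat, y_likelihoods = model.encode_and_quantize(x)  # assumed API
    x_hat = model.decode(y_hat)                          # assumed API

    # Rate term: estimated latent entropy, -log2 p(y) in bits per pixel.
    num_pixels = x.shape[0] * x.shape[2] * x.shape[3]
    rate_bpp = -torch.log2(y_likelihoods).sum() / num_pixels

    # Distortion term (MSE here; the paper may use another metric).
    distortion = F.mse_loss(x_hat, x)

    # Regularizer: an estimate of H(X|Y), e.g. the negative log-likelihood
    # of x under a conditional density model given y_hat (assumed helper).
    # Subtracting it maximizes H(X|Y) during training, matching
    # "incorporating the negative conditional source entropy".
    cond_source_entropy = model.conditional_nll(x, y_hat)  # assumed API

    return rate_bpp + lam * distortion - gamma * cond_source_entropy
```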
@article{zhang2025_2411.16727,
  title={An Information-Theoretic Regularizer for Lossy Neural Image Compression},
  author={Yingwen Zhang and Meng Wang and Xihua Sheng and Peilin Chen and Junru Li and Li Zhang and Shiqi Wang},
  journal={arXiv preprint arXiv:2411.16727},
  year={2025}
}