A Neural Difference-of-Entropies Estimator for Mutual Information

Abstract
Estimating Mutual Information (MI), a key measure of dependence between random quantities that requires no specific modelling assumptions, is a challenging problem in high dimensions. We propose a novel mutual information estimator based on parametrizing conditional densities using normalizing flows, a class of deep generative models that has gained popularity in recent years. Writing MI as a difference of entropies, I(X;Y) = H(Y) - H(Y|X), the estimator leverages a block autoregressive structure to achieve improved bias-variance trade-offs on standard benchmark tasks.
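To illustrate the difference-of-entropies idea, here is a minimal sketch: I(X;Y) is estimated as the gap between the average negative log-likelihoods of a marginal density model q(y) and a conditional model q(y|x). For simplicity the sketch fits plain Gaussian models on a toy jointly Gaussian dataset; these are stand-ins for the paper's block-autoregressive normalizing flows, and all variable names are illustrative assumptions rather than the authors' code.

# Difference-of-entropies (DoE) sketch:  I(X;Y) = H(Y) - H(Y|X),
# with each entropy replaced by the cross-entropy of a fitted density model.
# Gaussian models are used here purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
n, rho = 100_000, 0.8

# Jointly Gaussian toy data with correlation rho; true MI = -0.5*log(1 - rho^2).
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

def gaussian_nll(z, mean, var):
    """Average negative log-likelihood of z under N(mean, var)."""
    return 0.5 * np.mean(np.log(2.0 * np.pi * var) + (z - mean) ** 2 / var)

# Cross-entropy estimate of H(Y): fit a marginal Gaussian q(y).
h_y = gaussian_nll(y, y.mean(), y.var())

# Cross-entropy estimate of H(Y|X): fit a linear-Gaussian conditional q(y|x).
a, b = np.polyfit(x, y, deg=1)        # regression y ~ a*x + b
resid_var = np.var(y - (a * x + b))   # homoscedastic residual variance
h_y_given_x = gaussian_nll(y, a * x + b, resid_var)

mi_doe = h_y - h_y_given_x
mi_true = -0.5 * np.log(1.0 - rho**2)
print(f"DoE estimate: {mi_doe:.4f}   true MI: {mi_true:.4f}")

In the paper's setting, the two Gaussian fits would be replaced by normalizing flows, whose exact log-likelihoods make the same cross-entropy estimates tractable in higher dimensions.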
@article{ni2025_2502.13085,
  title   = {A Neural Difference-of-Entropies Estimator for Mutual Information},
  author  = {Haoran Ni and Martin Lotz},
  journal = {arXiv preprint arXiv:2502.13085},
  year    = {2025}
}