Investigation of Alternative Measures for Mutual Information

Abstract

Mutual information I(X;Y) is a fundamental quantity in information theory that measures how much information the random variable Y carries about the random variable X. One way to define the mutual information is to compare the joint distribution of X and Y with the product of their marginals via the KL divergence. If the two distributions are close, Y leaks almost nothing about X, since the two variables are then close to independent. In the discrete setting the mutual information has the appealing interpretation of the number of bits Y reveals about X, and if I(X;Y)=H(X) (the Shannon entropy of X) then X is completely revealed. In the continuous case, however, this reasoning no longer applies; for instance, the mutual information can be infinite. This fact enables us to try other metrics or divergences in place of the KL divergence. In this paper, we evaluate several metrics and divergences, namely the Kullback-Leibler (KL) divergence, the Wasserstein distance, the Jensen-Shannon divergence, and the total variation distance, as alternatives to the mutual information in the continuous case. We deploy different methods to estimate or bound these metrics and divergences and evaluate their performance.
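As a minimal illustration of the definition the abstract builds on, the sketch below computes the discrete mutual information of a toy joint distribution as the KL divergence between the joint and the product of the marginals, and compares it against H(X). The joint distribution here is an invented example, not one from the paper:

```python
import numpy as np

# Toy joint distribution of two binary variables X and Y
# (rows index x, columns index y); values are illustrative only.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)  # marginal distribution of X
p_y = p_xy.sum(axis=0)  # marginal distribution of Y

# I(X;Y) = KL(p_xy || p_x * p_y), measured in bits (log base 2).
prod = np.outer(p_x, p_y)
mask = p_xy > 0  # skip zero-probability cells (0 * log 0 := 0)
mi = float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask])))

# Shannon entropy of X, the upper bound: I(X;Y) <= H(X).
h_x = float(-np.sum(p_x * np.log2(p_x)))

print(f"I(X;Y) = {mi:.4f} bits, H(X) = {h_x:.4f} bits")
```

When I(X;Y) reaches H(X), observing Y pins down X exactly; here the off-diagonal mass keeps the mutual information strictly below the entropy.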
