Measuring multivariate redundant information with pointwise common change in surprisal
The problem of how to properly quantify redundant information is an open question that has been the subject of much recent research. Redundant information refers to information about a target variable S that is common to two or more predictor variables Xi. It can be thought of as quantifying overlapping information content or similarities in the representation of S between the Xi. We present a new measure of redundancy that quantifies the common change in surprisal shared between variables at the local or pointwise level. We demonstrate how this redundancy measure can be used within the framework of the Partial Information Decomposition (PID) to give an intuitive decomposition of the multivariate mutual information for a range of example systems, including continuous Gaussian variables. We also propose a modification of the PID in which we normalise partial information terms from non-disjoint sets of sources within the same level of the redundancy lattice, to prevent negative terms resulting from over-counting dependent partial information values. Our redundancy measure is easy to compute, and Matlab code implementing the measure, together with all considered examples, is provided.
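The pointwise quantities the abstract describes can be illustrated with a small sketch. The snippet below (an assumption-laden illustration, not the authors' full measure, which also involves a maximum-entropy surrogate distribution) computes, for discrete variables X1, X2 and target S, the local mutual information of each predictor (the change in surprisal of s from observing each x) and the local co-information, then sums the co-information over outcomes where all three quantities share the same sign, as a common-change-in-surprisal heuristic:

```python
import numpy as np

def local_info(p_joint):
    """Yield (prob, i1, i2, coinfo) per outcome of p_joint[x1, x2, s].

    i1, i2 are the local mutual informations (change in surprisal of s
    from observing x1 or x2); coinfo = i1 + i2 - i12 is the local
    co-information, the overlap of the two surprisal changes.
    """
    p = np.asarray(p_joint, dtype=float)
    p /= p.sum()
    # marginals and pairwise joints by summing out the other axes
    p_x1 = p.sum(axis=(1, 2))
    p_x2 = p.sum(axis=(0, 2))
    p_s = p.sum(axis=(0, 1))
    p_x1s = p.sum(axis=1)
    p_x2s = p.sum(axis=0)
    p_x1x2 = p.sum(axis=2)
    for (x1, x2, s), pv in np.ndenumerate(p):
        if pv == 0:
            continue
        i1 = np.log2(p_x1s[x1, s] / (p_x1[x1] * p_s[s]))
        i2 = np.log2(p_x2s[x2, s] / (p_x2[x2] * p_s[s]))
        i12 = np.log2(pv / (p_x1x2[x1, x2] * p_s[s]))
        yield pv, i1, i2, i1 + i2 - i12

def redundancy(p_joint):
    """Sum pointwise co-information where both local MIs and the
    co-information agree in sign (common change in surprisal)."""
    return sum(pv * c for pv, i1, i2, c in local_info(p_joint)
               if np.sign(i1) == np.sign(i2) == np.sign(c))

# Example: X1 = X2 = S, a fully redundant binary copy.
p = np.zeros((2, 2, 2))
p[0, 0, 0] = p[1, 1, 1] = 0.5
print(redundancy(p))  # → 1.0 bit, all target information is shared
```

For the redundant-copy example every pointwise term is +1 bit, so the heuristic recovers the intuitive answer of 1 bit of redundancy; the sign condition exists to keep misinformative and informative surprisal changes from cancelling.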