
On w-mixtures: Finite convex combinations of prescribed component distributions

Abstract

We consider the space of w-mixtures, defined as the set of finite statistical mixtures sharing the same prescribed component distributions, closed under convex combinations. The information geometry induced by the Bregman generator set to the Shannon negentropy on this space yields a dually flat space called the mixture family manifold. We show how the Kullback-Leibler (KL) divergence can be recovered from the corresponding Bregman divergence for the negentropy generator: that is, the KL divergence between two w-mixtures amounts to a Bregman divergence (BD) induced by the Shannon negentropy generator. Thus the KL divergence between two Gaussian mixture models (GMMs) sharing the same Gaussian components is equivalent to a Bregman divergence. This KL-BD equivalence on a mixture family manifold implies that we can perform optimal KL-averaging aggregation of w-mixtures without information loss. More generally, we prove that the statistical skew Jensen-Shannon divergence between w-mixtures is equivalent to a skew Jensen divergence between their corresponding parameters. Finally, we state several properties, divergence identities, and inequalities relating to w-mixtures.
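As a concrete illustration of the KL-BD equivalence stated in the abstract, the sketch below numerically checks that the KL divergence between two w-mixtures built on the same two Gaussian components coincides with the Bregman divergence induced by the Shannon negentropy generator F(theta) = -h(m_theta). The component parameters (N(0,1) and N(5,2)), the integration grid, and the finite-difference step are illustrative assumptions, not taken from the paper.

```python
# Minimal numerical sketch (illustrative, not code from the paper): check that the
# KL divergence between two w-mixtures sharing the same two prescribed Gaussian
# components matches the Bregman divergence induced by the Shannon negentropy.
import numpy as np
from scipy.stats import norm

# Prescribed component densities p0 and p1 (fixed for all w-mixtures).
p0 = norm(loc=0.0, scale=1.0).pdf
p1 = norm(loc=5.0, scale=2.0).pdf

x = np.linspace(-15.0, 25.0, 40001)   # integration grid (assumed wide enough)
dx = x[1] - x[0]

def integrate(values):
    """Riemann-sum approximation of an integral over the grid."""
    return float(np.sum(values) * dx)

def mixture(theta):
    """Density of the w-mixture m_theta = (1 - theta) * p0 + theta * p1."""
    return (1.0 - theta) * p0(x) + theta * p1(x)

def negentropy(theta):
    """Bregman generator F(theta) = -h(m_theta) = integral of m_theta log m_theta."""
    m = mixture(theta)
    return integrate(m * np.log(m))

def kl(theta1, theta2):
    """KL(m_theta1 : m_theta2) computed directly by numerical integration."""
    m1, m2 = mixture(theta1), mixture(theta2)
    return integrate(m1 * np.log(m1 / m2))

def bregman(theta1, theta2, eps=1e-5):
    """B_F(theta1 : theta2) = F(theta1) - F(theta2) - (theta1 - theta2) F'(theta2),
    with F'(theta2) approximated by central finite differences."""
    grad = (negentropy(theta2 + eps) - negentropy(theta2 - eps)) / (2.0 * eps)
    return negentropy(theta1) - negentropy(theta2) - (theta1 - theta2) * grad

theta1, theta2 = 0.3, 0.7
print("KL(m_0.3 : m_0.7) :", kl(theta1, theta2))
print("B_F(0.3 : 0.7)    :", bregman(theta1, theta2))
# The two printed values agree up to discretization and finite-difference error.
```

The same check extends to more than two components by replacing the scalar parameter theta with a weight vector and the scalar derivative with a gradient.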
