
On w-mixtures: Finite convex combinations of prescribed component distributions

Abstract

We consider the space of w-mixtures, that is, the set of finite statistical mixtures sharing the same prescribed component distributions. The geometry induced by the Kullback-Leibler (KL) divergence on this family of w-mixtures is a dually flat space in information geometry called the mixture family manifold. It follows that the KL divergence between two w-mixtures is equivalent to a Bregman divergence (BD) defined for the negative Shannon entropy generator. Thus the KL divergence between two Gaussian Mixture Models (GMMs) sharing the same components is (theoretically) a Bregman divergence. This KL-BD equivalence implies that we can perform optimal KL-averaging aggregation of w-mixtures without information loss. More generally, we prove that the skew Jensen-Shannon divergence between w-mixtures is equivalent to a skew Jensen divergence on their parameters. Finally, we state several divergence identities and inequalities relating w-mixtures.
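
As a rough numerical illustration of the KL-BD equivalence stated above, the following sketch (assuming NumPy and SciPy are available) compares the KL divergence between two w-mixtures of the same two fixed Gaussian components against the Bregman divergence induced by the negative Shannon entropy of the mixture. The particular component means and scales, the weights 0.3 and 0.7, the integration range, and the finite-difference step are arbitrary illustrative choices, not values from the paper.

```python
# Minimal sketch: KL between two w-mixtures vs. Bregman divergence on their weights.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# Fixed (prescribed) component densities p0 and p1 (illustrative choices).
p0 = norm(loc=-2.0, scale=1.0).pdf
p1 = norm(loc=3.0, scale=0.5).pdf

def mixture(w):
    """Density of the w-mixture m_w(x) = (1 - w) p0(x) + w p1(x)."""
    return lambda x: (1.0 - w) * p0(x) + w * p1(x)

def kl(w_a, w_b):
    """KL(m_{w_a} : m_{w_b}) computed by numerical integration."""
    ma, mb = mixture(w_a), mixture(w_b)
    return quad(lambda x: ma(x) * np.log(ma(x) / mb(x)), -15.0, 15.0)[0]

def neg_entropy(w):
    """F(w) = -h(m_w): negative Shannon entropy, the Bregman generator."""
    m = mixture(w)
    return quad(lambda x: m(x) * np.log(m(x)), -15.0, 15.0)[0]

def bregman(w_a, w_b, eps=1e-5):
    """B_F(w_a : w_b) = F(w_a) - F(w_b) - F'(w_b) (w_a - w_b), with F' by finite differences."""
    grad = (neg_entropy(w_b + eps) - neg_entropy(w_b - eps)) / (2.0 * eps)
    return neg_entropy(w_a) - neg_entropy(w_b) - grad * (w_a - w_b)

# The two printed values should agree up to numerical integration/differentiation error.
print(kl(0.3, 0.7))
print(bregman(0.3, 0.7))
```

Since the mixture density is linear in the weight parameter, expanding the Bregman divergence with generator F(w) = -h(m_w) recovers exactly the KL integral, which is what the numerical check above exhibits on a single-parameter family.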
