On w-mixtures: Finite convex combinations of prescribed component distributions

We consider the space of w-mixtures, that is, the set of finite statistical mixtures sharing the same prescribed component distributions. The geometry induced by the Kullback-Leibler (KL) divergence on this family of w-mixtures is a dually flat space in information geometry called the mixture family manifold. It follows that the KL divergence between two w-mixtures is equivalent to a Bregman Divergence (BD) defined for the negative Shannon entropy generator. Thus the KL divergence between two Gaussian Mixture Models (GMMs) sharing the same components is (theoretically) a Bregman divergence. This KL-BD equivalence implies that we can perform optimal KL-averaging aggregation of w-mixtures without information loss. More generally, we prove that the skew Jensen-Shannon divergence between w-mixtures is equivalent to a skew Jensen divergence on their parameters. Finally, we state several divergence identities and inequalities relating w-mixtures.
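The KL-BD equivalence can be checked numerically. The sketch below (not from the paper; the two Gaussian components, weights, integration bounds, and finite-difference step are illustrative choices) compares the KL divergence between two univariate GMMs sharing the same prescribed components with the Bregman divergence induced by the negative Shannon entropy generator F(θ) = -H(m_θ); the two quantities should agree up to numerical error.

```python
# Illustrative check (assumed setup, not the authors' code):
# KL(m_theta1 : m_theta2) vs. Bregman divergence B_F(theta1 : theta2)
# for w-mixtures m_theta = (1 - theta) p0 + theta p1 sharing components p0, p1,
# with generator F(theta) = -H(m_theta) (negative Shannon entropy).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Two fixed (prescribed) Gaussian components.
p0 = norm(loc=0.0, scale=1.0).pdf
p1 = norm(loc=3.0, scale=1.0).pdf

def mixture(theta):
    """w-mixture m_theta = (1 - theta) p0 + theta p1, theta in (0, 1)."""
    return lambda x: (1.0 - theta) * p0(x) + theta * p1(x)

def neg_entropy(theta, lo=-10.0, hi=13.0):
    """Generator F(theta) = -H(m_theta) = integral of m_theta log m_theta dx."""
    m = mixture(theta)
    return quad(lambda x: m(x) * np.log(m(x)), lo, hi)[0]

def kl(theta1, theta2, lo=-10.0, hi=13.0):
    """KL(m_theta1 : m_theta2) by numerical integration."""
    m1, m2 = mixture(theta1), mixture(theta2)
    return quad(lambda x: m1(x) * np.log(m1(x) / m2(x)), lo, hi)[0]

def bregman(theta1, theta2, eps=1e-5):
    """B_F(theta1 : theta2) = F(theta1) - F(theta2) - (theta1 - theta2) F'(theta2),
    with F'(theta2) estimated by a central finite difference."""
    grad = (neg_entropy(theta2 + eps) - neg_entropy(theta2 - eps)) / (2 * eps)
    return neg_entropy(theta1) - neg_entropy(theta2) - (theta1 - theta2) * grad

theta1, theta2 = 0.3, 0.7
print("KL     :", kl(theta1, theta2))
print("Bregman:", bregman(theta1, theta2))  # should match up to integration/step error
```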