Fast approximations of the Jeffreys divergence between univariate
Gaussian mixture models via exponential polynomial densities
The Jeffreys divergence is a renowned symmetrization of the Kullback-Leibler divergence, often used in machine learning, signal processing, and information sciences. Since the Jeffreys divergence between the ubiquitous Gaussian Mixture Models (GMMs) is not available in closed form, many techniques with various pros and cons have been proposed in the literature to either (i) estimate, (ii) approximate, or (iii) lower and upper bound this divergence. In this work, we propose a simple yet fast heuristic to approximate the Jeffreys divergence between two GMMs with arbitrary numbers of components. The heuristic relies on converting the GMMs into a pair of dually parameterized probability densities belonging to exponential families. In particular, we consider Polynomial Exponential Densities (PEDs) and design a goodness-of-fit criterion that measures the dissimilarity between a GMM and a PED, generalizing the Hyvärinen divergence. This criterion allows one to select the orders of the PEDs that approximate the GMMs. We demonstrate experimentally that our heuristic improves on the computational time of the stochastic Monte Carlo estimation baseline by several orders of magnitude while approximating the Jeffreys divergence reasonably well, especially when the univariate mixtures have a small number of modes.
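To make the baseline concrete, the following is a minimal sketch (not the paper's code) of the stochastic Monte Carlo estimation the abstract compares against: the Jeffreys divergence J(p, q) = KL(p||q) + KL(q||p) between two univariate GMMs, with each KL term estimated by averaging the log density ratio over samples drawn from the corresponding mixture. The function and parameter names (`gmm_pdf`, `mc_jeffreys`, weight/mean/std tuples) are illustrative assumptions.

```python
import numpy as np

def gmm_pdf(x, weights, means, stds):
    """Density of a univariate Gaussian mixture evaluated at points x."""
    x = np.asarray(x, dtype=float)[:, None]  # shape (n, 1) for broadcasting
    comp = np.exp(-0.5 * ((x - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))
    return comp @ weights  # mixture density, shape (n,)

def gmm_sample(n, weights, means, stds, rng):
    """Draw n samples from the mixture by picking a component, then a normal draw."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[idx], stds[idx])

def mc_jeffreys(p_params, q_params, n=100_000, seed=0):
    """Monte Carlo estimate of J(p, q) = KL(p||q) + KL(q||p) for two GMMs."""
    rng = np.random.default_rng(seed)
    xp = gmm_sample(n, *p_params, rng)  # samples from p for KL(p||q)
    xq = gmm_sample(n, *q_params, rng)  # samples from q for KL(q||p)
    kl_pq = np.mean(np.log(gmm_pdf(xp, *p_params) / gmm_pdf(xp, *q_params)))
    kl_qp = np.mean(np.log(gmm_pdf(xq, *q_params) / gmm_pdf(xq, *p_params)))
    return kl_pq + kl_qp
```

For two single-component "mixtures" N(0, 1) and N(1, 1), the exact Jeffreys divergence is (μ₁ − μ₂)²/σ² = 1, which the estimator recovers up to Monte Carlo error; its cost grows with the sample size n, which is the expense the paper's PED-based heuristic is designed to avoid.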