Generalized Guarantees for Variational Inference in the Presence of Even and Elliptical Symmetry
We extend several recent results providing symmetry-based guarantees for variational inference (VI) with location-scale families. VI approximates a target density $p$ by the best match $q^*$ in a family $\mathcal{Q}$ of tractable distributions that in general does not contain $p$. It is known that VI can recover key properties of $p$, such as its mean and correlation matrix, when $p$ and $\mathcal{Q}$ exhibit certain symmetries and $q^*$ is found by minimizing the reverse Kullback-Leibler divergence. We extend these guarantees in two important directions. First, we provide symmetry-based guarantees for $f$-divergences, a broad class that includes the reverse and forward Kullback-Leibler divergences and the $\alpha$-divergences. We highlight properties specific to the reverse Kullback-Leibler divergence under which we obtain our strongest guarantees. Second, we obtain further guarantees for VI when the target density exhibits even and elliptical symmetries in some but not all of its coordinates. These partial symmetries arise naturally in Bayesian hierarchical models, where the prior induces a challenging geometry but still possesses axes of symmetry. We illustrate these theoretical results in a number of experimental settings.
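To make the setup concrete, the sketch below (not the authors' code) fits a full-covariance Gaussian location-scale family to a toy target by stochastically minimizing the reverse Kullback-Leibler divergence with the reparameterization trick; the target `log_p`, step size, sample count, and iteration budget are all illustrative assumptions.

```python
# Minimal sketch, assuming a toy 2-D target: fit q(x) = N(mu, L L^T) to p
# by Monte Carlo minimization of the reverse KL divergence KL(q || p).
import jax
import jax.numpy as jnp

def log_p(x):
    # Illustrative even-symmetric target: a correlated Gaussian standing in for p.
    cov = jnp.array([[1.0, 0.8], [0.8, 2.0]])
    return jax.scipy.stats.multivariate_normal.logpdf(x, jnp.zeros(2), cov)

def reverse_kl(params, key, num_samples=256):
    mu, L = params["mu"], jnp.tril(params["L"])   # L is a Cholesky-like factor
    eps = jax.random.normal(key, (num_samples, mu.shape[0]))
    x = mu + eps @ L.T                            # reparameterized samples from q
    log_q = jax.scipy.stats.multivariate_normal.logpdf(x, mu, L @ L.T)
    return jnp.mean(log_q - jax.vmap(log_p)(x))   # Monte Carlo estimate of KL(q || p)

params = {"mu": jnp.zeros(2), "L": jnp.eye(2)}
step_size = 0.05                                  # illustrative learning rate
key = jax.random.PRNGKey(0)
for _ in range(500):
    key, subkey = jax.random.split(key)
    grads = jax.grad(reverse_kl)(params, subkey)
    params = jax.tree_util.tree_map(lambda p, g: p - step_size * g, params, grads)

print("fitted mean:", params["mu"])               # approaches the target mean (0, 0)
```

In this Gaussian-on-Gaussian toy case the fitted mean matches the target mean exactly in the limit; the paper's guarantees concern what such location-scale fits recover when $p$ is not in $\mathcal{Q}$ but exhibits even or elliptical symmetry.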