Given an intractable distribution $p$, the problem of variational inference (VI) is to find the best approximation $q$ from some more tractable family $\mathcal{Q}$. Commonly, one chooses $\mathcal{Q}$ to be a family of factorized distributions (i.e., the mean-field assumption), even though $p$ itself does not factorize. We show that this mismatch leads to an impossibility theorem: if $p$ does not factorize, then any factorized approximation $q \in \mathcal{Q}$ can correctly estimate at most one of the following three measures of uncertainty: (i) the marginal variances, (ii) the marginal precisions, or (iii) the generalized variance (which can be related to the entropy). In practice, the best variational approximation in $\mathcal{Q}$ is found by minimizing some divergence between distributions, and so we ask: how does the choice of divergence determine which measure of uncertainty, if any, is correctly estimated by VI? We consider the classic Kullback-Leibler divergences, the more general $\alpha$-divergences, and a score-based divergence which compares $\nabla \log p$ and $\nabla \log q$. We provide a thorough theoretical analysis in the setting where $p$ is a Gaussian and $q$ is a (factorized) Gaussian. We show that all the considered divergences can be \textit{ordered} based on the estimates of uncertainty they yield as objective functions for VI. Finally, we empirically evaluate the validity of this ordering when the target distribution is not Gaussian.
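To make the trade-off concrete, here is a standard worked example (our own illustration, not drawn from the paper; the bivariate target and the notation $\rho$, $\sigma_i^2$ are assumptions chosen for exposition). A mean-field Gaussian fit to a correlated bivariate Gaussian recovers either the marginal precisions (reverse KL) or the marginal variances (forward KL), but in neither case the generalized variance:
\[
p = \mathcal{N}\!\left(0,\, \Sigma\right), \qquad
\Sigma = \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}, \qquad
\Sigma^{-1} = \frac{1}{1-\rho^2}\begin{pmatrix} 1 & -\rho \\ -\rho & 1 \end{pmatrix}.
\]
For a factorized Gaussian $q(z) = \prod_i \mathcal{N}(z_i \mid 0, \sigma_i^2)$, the classic results are
\[
\arg\min_q \mathrm{KL}(q \,\|\, p): \quad \sigma_i^2 = \big[(\Sigma^{-1})_{ii}\big]^{-1} = 1-\rho^2
\quad \text{(marginal precisions exact, variances underestimated)},
\]
\[
\arg\min_q \mathrm{KL}(p \,\|\, q): \quad \sigma_i^2 = \Sigma_{ii} = 1
\quad \text{(marginal variances exact, precisions underestimated)},
\]
while the generalized variance of the target is $\det \Sigma = 1-\rho^2$, versus $\prod_i \sigma_i^2 = (1-\rho^2)^2$ under the reverse KL and $1$ under the forward KL; for $\rho \neq 0$ neither objective recovers it, which is consistent with the impossibility theorem stated above.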
@article{margossian2025_2403.13748,
  title   = {Variational Inference for Uncertainty Quantification: an Analysis of Trade-offs},
  author  = {Charles C. Margossian and Loucas Pillaud-Vivien and Lawrence K. Saul},
  journal = {arXiv preprint arXiv:2403.13748},
  year    = {2025}
}