Gibbs sampler and coordinate ascent variational inference: a set-theoretical review

Abstract

A central task in Bayesian machine learning is the approximation of the posterior distribution. The Gibbs sampler and coordinate ascent variational inference are commonly used approximation schemes, relying on stochastic and deterministic approximations, respectively. This article shows that the two schemes can be explained more generally from a set-theoretical point of view. This is an immediate consequence of a duality formula for variational inference.
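The stochastic/deterministic parallel between the two schemes can be made concrete on a toy target. The sketch below (not from the paper; the target, function names, and parameter `RHO` are illustrative assumptions) runs both schemes on a bivariate standard normal with correlation `RHO`: a Gibbs sweep draws each coordinate from its full conditional, while a coordinate ascent variational inference (CAVI) sweep deterministically moves each mean-field factor's mean to the conditional expectation given the other factor.

```python
import random

RHO = 0.8                      # correlation of the toy bivariate normal target (assumed)
SD = (1.0 - RHO * RHO) ** 0.5  # std dev of each full conditional / optimal factor

def gibbs_sweep(x, y, rng):
    """One stochastic sweep: draw each coordinate from its full conditional."""
    x = rng.gauss(RHO * y, SD)  # x | y ~ N(rho * y, 1 - rho^2)
    y = rng.gauss(RHO * x, SD)  # y | x ~ N(rho * x, 1 - rho^2)
    return x, y

def cavi_sweep(m_x, m_y):
    """One deterministic sweep: update each mean-field Gaussian factor's mean
    to the conditional expectation given the current other factor."""
    m_x = RHO * m_y
    m_y = RHO * m_x
    return m_x, m_y

if __name__ == "__main__":
    # Gibbs: a Markov chain whose stationary law is the target.
    rng = random.Random(0)
    x, y = 0.0, 0.0
    samples = []
    for i in range(21000):
        x, y = gibbs_sweep(x, y, rng)
        if i >= 1000:           # discard burn-in
            samples.append((x, y))

    # CAVI: a fixed-point iteration; the factor means contract to 0,
    # the mean of the target, at geometric rate rho^2 per sweep.
    m_x, m_y = 1.0, 1.0
    for _ in range(50):
        m_x, m_y = cavi_sweep(m_x, m_y)
```

Both loops visit one coordinate at a time while holding the other fixed, which is the shared structure the article abstracts in set-theoretical terms.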
