On Sparse Variational Methods and the Kullback-Leibler Divergence between Stochastic Processes

Abstract
The variational framework for learning inducing variables (Titsias, 2009) has had a large impact on the Gaussian process literature. The framework may be interpreted as minimizing a rigorously defined Kullback-Leibler divergence between the approximating and posterior processes. To our knowledge, this connection has so far gone unremarked in the literature. Many of the technical requirements for such a result were derived in the pioneering work of Seeger (2003a; 2003b). In this work we give a relatively gentle and largely self-contained explanation of the result. The result is important for understanding the variational inducing-variable framework and could lead to principled new generalizations.
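To make the claim above concrete, the following is a minimal sketch of the connection in standard sparse-GP notation; the symbols used (y for the observations, f for the latent function values at the training inputs, u for the inducing values at the inducing locations, q(u) for the variational distribution) are our own shorthand and do not appear in the abstract itself. Maximizing the variational lower bound L(q) is equivalent to minimizing the finite-dimensional KL divergence on the left-hand side, and the result discussed here is that this quantity coincides with a rigorously defined KL divergence between the approximating and posterior stochastic processes.

\begin{align}
  \log p(\mathbf{y}) - \mathcal{L}(q)
    &= \mathrm{KL}\big[\, q(\mathbf{u})\, p(\mathbf{f} \mid \mathbf{u}) \,\big\|\, p(\mathbf{f}, \mathbf{u} \mid \mathbf{y}) \,\big], \\
  \mathcal{L}(q)
    &= \mathbb{E}_{q(\mathbf{u})\, p(\mathbf{f} \mid \mathbf{u})}\big[ \log p(\mathbf{y} \mid \mathbf{f}) \big]
       - \mathrm{KL}\big[\, q(\mathbf{u}) \,\big\|\, p(\mathbf{u}) \,\big].
\end{align}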