Sequential Monte Carlo approximations of Wasserstein--Fisher--Rao gradient flows
We consider the problem of sampling from a target probability distribution. It is well known that this task can be cast as an optimisation problem over the space of probability distributions in which we aim to minimise the Kullback--Leibler divergence from the target. We consider several partial differential equations (PDEs) whose solution is a minimiser of the Kullback--Leibler divergence from the target and connect them to well-known Monte Carlo algorithms. We focus in particular on PDEs obtained by considering the Wasserstein--Fisher--Rao geometry over the space of probabilities and show that these lead to a natural implementation using importance sampling and sequential Monte Carlo. We propose a novel algorithm to approximate the Wasserstein--Fisher--Rao flow of the Kullback--Leibler divergence and conduct an extensive empirical study to identify when these algorithms outperform other popular Monte Carlo algorithms.
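To make the decomposition concrete, the sketch below illustrates, under strong assumptions, how a Wasserstein--Fisher--Rao flow splits into two particle updates: a Wasserstein (transport) part, implemented as an unadjusted Langevin step along the score of the target, and a Fisher--Rao (reaction/birth--death) part, implemented via importance weighting against a kernel density estimate of the current particle cloud followed by multinomial resampling. The target (a standard normal), the bandwidth, the step sizes, and the KDE-based weight are all illustrative choices, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D target: standard normal, log-density up to a constant.
def log_target(x):
    return -0.5 * x**2

n, steps, dt, bw = 500, 500, 0.01, 0.5
x = rng.normal(loc=3.0, scale=1.0, size=n)  # particles start far from the target

for _ in range(steps):
    # Wasserstein part: unadjusted Langevin step along grad log pi.
    grad = -x  # score of the standard normal
    x = x + dt * grad + np.sqrt(2.0 * dt) * rng.normal(size=n)

    # Fisher--Rao part: estimate the current density with a Gaussian KDE,
    # reweight particles by a small power of pi / rho_hat (the birth--death
    # rate is log(rho/pi)), then resample. A crude stand-in for the
    # reaction term, chosen for brevity.
    kde = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bw) ** 2).mean(axis=1)
    logw = log_target(x) - np.log(kde)
    w = np.exp(dt * (logw - logw.mean()))  # small Fisher--Rao step, centred
    w /= w.sum()
    x = rng.choice(x, size=n, p=w)  # multinomial resampling

print(x.mean(), x.std())  # should be roughly (0, 1)
```

In a sequential Monte Carlo implementation one would typically track the weights explicitly and resample only when the effective sample size drops below a threshold, rather than resampling at every step as this toy loop does.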