Variational inference via Wasserstein gradient flows

arXiv:2205.15902 · 31 May 2022
Marc Lambert, Sinho Chewi, Francis R. Bach, Silvère Bonnabel, Philippe Rigollet
Abstract

Along with Markov chain Monte Carlo (MCMC) methods, variational inference (VI) has emerged as a central computational approach to large-scale Bayesian inference. Rather than sampling from the true posterior π, VI aims at producing a simple but effective approximation π̂ to π for which summary statistics are easy to compute. However, unlike the well-studied MCMC methodology, algorithmic guarantees for VI remain comparatively poorly understood. In this work, we propose principled methods for VI, in which π̂ is taken to be a Gaussian or a mixture of Gaussians, resting upon the theory of gradient flows on the Bures–Wasserstein space of Gaussian measures. Akin to MCMC, these methods come with strong theoretical guarantees when π is log-concave.
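To make the abstract's core idea concrete, the sketch below runs a commonly cited one-sample discretization of the Bures–Wasserstein gradient flow for minimizing KL(N(m, Σ) || π) over Gaussians: the mean follows a stochastic gradient of the potential V = -log π, and the covariance is updated by a symmetric "sandwich" step. This is a minimal illustrative sketch, not the paper's exact algorithm; the toy Gaussian target and all names (grad_V, hess_V, bw_sgd, step sizes) are assumptions chosen for the example, and it assumes V's gradient and Hessian are available.

import numpy as np

# Hypothetical toy log-concave target: pi = N(mu_star, Sigma_star),
# so the potential is V(x) = -log pi(x) up to an additive constant.
mu_star = np.array([1.0, -2.0])
Sigma_star = np.array([[2.0, 0.5], [0.5, 1.0]])
P_star = np.linalg.inv(Sigma_star)  # precision matrix of the target

def grad_V(x):
    # Gradient of the potential for the Gaussian toy target.
    return P_star @ (x - mu_star)

def hess_V(x):
    # Hessian of the potential (constant for a Gaussian target).
    return P_star

def bw_sgd(m, Sigma, step=0.1, n_iters=500, rng=np.random.default_rng(0)):
    """One-sample Bures-Wasserstein SGD on KL(N(m, Sigma) || pi) -- a sketch.

    Per iteration, with X ~ N(m, Sigma):
      m     <- m - h * grad_V(X)
      Sigma <- (I - h M) Sigma (I - h M),  where M = hess_V(X) - Sigma^{-1}
    """
    d = m.shape[0]
    I = np.eye(d)
    for _ in range(n_iters):
        X = rng.multivariate_normal(m, Sigma)
        M = hess_V(X) - np.linalg.inv(Sigma)
        m = m - step * grad_V(X)
        Sigma = (I - step * M) @ Sigma @ (I - step * M)
    return m, Sigma

m_hat, Sigma_hat = bw_sgd(np.zeros(2), np.eye(2))
print("mean error:", np.linalg.norm(m_hat - mu_star))
print("cov  error:", np.linalg.norm(Sigma_hat - Sigma_star))

Note the design of the covariance step: the symmetric form (I - hM) Σ (I - hM) keeps Σ symmetric and, for a small enough step size, positive definite, which is why such sandwich updates appear in discretizations of flows on the space of Gaussian measures. For the Gaussian toy target above the iterates should approach (mu_star, Sigma_star), consistent with the convergence guarantees the abstract states for log-concave π.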
