
α-Variational Inference with Statistical Guarantees

9 October 2017
Yun Yang
D. Pati
A. Bhattacharya
Abstract

We propose a variational approximation to Bayesian posterior distributions, called α-VB, with provable statistical guarantees for models with and without latent variables. The standard variational approximation is a special case of α-VB with α = 1. When α ∈ (0, 1), a novel class of variational inequalities is developed for linking the Bayes risk under the variational approximation to the objective function in the variational optimization problem, implying that maximizing the evidence lower bound in variational inference has the effect of minimizing the Bayes risk within the variational density family. Operating in a frequentist setup, the variational inequalities imply that point estimates constructed from the α-VB procedure converge at an optimal rate to the true parameter in a wide range of problems. We illustrate our general theory with a number of examples, including the mean-field variational approximation to (low) high-dimensional Bayesian linear regression with spike and slab priors, mixture of Gaussian models, latent Dirichlet allocation, and (mixture of) Gaussian variational approximation in regular parametric models.
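To make the role of α concrete, here is a minimal illustrative sketch (not code from the paper): in α-VB the likelihood contribution to the objective is tempered by α, and α = 1 recovers the standard variational approximation. For a conjugate toy model — i.i.d. Gaussian observations with known noise variance and a Gaussian prior on the mean — the optimal Gaussian variational density coincides with the α-fractional posterior, which is available in closed form. The function name and prior/noise parameters below are hypothetical choices for the example.

```python
import random

def alpha_vb_gaussian(data, sigma2=1.0, tau2=10.0, alpha=1.0):
    """Closed-form alpha-VB (alpha-fractional posterior) N(mu, v)
    for theta, with likelihood N(theta, sigma2) and prior N(0, tau2).
    Tempering the likelihood by alpha scales its contribution to the
    posterior precision and mean; alpha = 1 is the standard posterior."""
    n = len(data)
    precision = alpha * n / sigma2 + 1.0 / tau2  # tempered likelihood + prior
    v = 1.0 / precision
    mu = v * (alpha * sum(data) / sigma2)
    return mu, v

# Simulate data from theta_true and compare alpha = 1 with alpha = 0.5.
random.seed(0)
theta_true = 2.0
data = [theta_true + random.gauss(0.0, 1.0) for _ in range(200)]

for a in (1.0, 0.5):
    mu, v = alpha_vb_gaussian(data, alpha=a)
    print(f"alpha={a}: posterior mean {mu:.3f}, variance {v:.5f}")
```

Note the behavior this illustrates: smaller α downweights the data relative to the prior, so the α = 0.5 approximation has a larger posterior variance, while both point estimates remain close to the true parameter as n grows — consistent with the optimal-rate convergence claimed in the abstract.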
