
arXiv:2007.15130
Unnormalized Variational Bayes

29 July 2020
Saeed Saremi
Abstract

We unify empirical Bayes and variational Bayes for approximating unnormalized densities. This framework, named unnormalized variational Bayes (UVB), is based on formulating a latent variable model for the random variable $Y = X + N(0, \sigma^2 I_d)$ and using the evidence lower bound (ELBO), computed by a variational autoencoder, as a parametrization of the energy function of $Y$, which is then used to estimate $X$ with the empirical Bayes least-squares estimator. In this intriguing setup, the \textit{gradient} of the ELBO with respect to noisy inputs plays the central role in learning the energy function. Empirically, we demonstrate that UVB has a higher capacity to approximate energy functions than the MLP parametrization used in neural empirical Bayes (DEEN). We especially showcase $\sigma = 1$, where the differences between UVB and DEEN become visible and qualitative in the denoising experiments. At this high noise level, the distribution of $Y$ is heavily smoothed, and we demonstrate that one can traverse all MNIST classes in a variety of styles in a single run, without restarts, via walk-jump sampling with a fast-mixing Langevin MCMC sampler. We finish by probing the encoder/decoder of the trained models and confirm UVB $\neq$ VAE.
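The walk-jump sampling the abstract describes has two pieces: a "walk" (Langevin MCMC on the smoothed variable $Y$, driven by the gradient of the learned energy) and a "jump" (the empirical Bayes least-squares estimate $\hat{x}(y) = y + \sigma^2 \nabla \log p(y) = y - \sigma^2 \nabla E(y)$). A minimal one-dimensional sketch, using a toy analytic energy as a stand-in for the paper's ELBO-parametrized one (`grad_E` below is an assumption for illustration, not the trained model):

```python
import math
import random

def langevin_walk(grad_energy, y, n_steps, step, seed=0):
    """Walk phase: unadjusted Langevin MCMC on the smoothed density of Y,
    y <- y - (step^2 / 2) * grad E(y) + step * standard normal noise."""
    rng = random.Random(seed)
    for _ in range(n_steps):
        y = y - 0.5 * step**2 * grad_energy(y) + step * rng.gauss(0.0, 1.0)
    return y

def jump(grad_energy, y, sigma):
    """Jump phase: empirical Bayes least-squares estimate of X given Y = y,
    x_hat(y) = y - sigma^2 * grad E(y)."""
    return y - sigma**2 * grad_energy(y)

# Toy setup (an assumption, not the paper's model): X ~ N(0, 1) and
# sigma = 1 give Y ~ N(0, 2), so E(y) = y^2 / 4 up to a constant.
sigma = 1.0
grad_E = lambda y: y / (1.0 + sigma**2)

y_final = langevin_walk(grad_E, 0.0, n_steps=2000, step=0.1)
x_hat = jump(grad_E, y_final, sigma)
```

In this toy case the jump recovers the exact posterior mean $E[X \mid Y = y] = y/2$; in the paper, `grad_E` would instead be the gradient of the VAE-computed ELBO with respect to the noisy input.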
