
Quasi-symplectic Langevin Variational Autoencoder

Abstract

The variational autoencoder (VAE) is a popular and well-studied generative model widely used in neural learning research. Applying VAEs to practical tasks involving massive, high-dimensional datasets requires addressing the difficulty of constructing low-variance evidence lower bounds (ELBOs). Markov chain Monte Carlo (MCMC) is an effective approach to tightening the ELBO when approximating the posterior distribution. The Hamiltonian Variational Autoencoder (HVAE) is an effective MCMC-inspired method for constructing a low-variance ELBO that is also amenable to the reparameterization trick. In this work, we propose a quasi-symplectic Langevin variational autoencoder (Langevin-VAE) that incorporates gradient information into the inference process through Langevin dynamics. We demonstrate the effectiveness of the proposed approach on toy and real-world examples.
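The abstract refers to incorporating gradient information into inference through Langevin dynamics. As background only (not the paper's quasi-symplectic scheme), the following minimal sketch shows the standard overdamped Langevin update, which drives samples toward a target density using its score function plus Gaussian noise; the standard-normal target and all parameter values here are illustrative assumptions.

```python
import numpy as np

def grad_log_p(z):
    """Score of an assumed standard-normal target: d/dz log N(z; 0, 1) = -z."""
    return -z

def langevin_step(z, step_size, rng):
    """One overdamped Langevin update:
    z_{t+1} = z_t + (eps / 2) * grad log p(z_t) + sqrt(eps) * noise.
    """
    noise = rng.standard_normal(z.shape)
    return z + 0.5 * step_size * grad_log_p(z) + np.sqrt(step_size) * noise

rng = np.random.default_rng(0)
z = 3.0 * rng.standard_normal(5000)  # start far from the target distribution
for _ in range(2000):
    z = langevin_step(z, step_size=0.05, rng=rng)
# After many steps, the samples approximately follow the target N(0, 1).
```

In a Langevin-based VAE, the gradient of the log joint density with respect to the latent variable plays the role of `grad_log_p`, refining samples from the approximate posterior.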
