Gaussian AutoEncoder

Abstract

Generative AutoEncoders require a chosen probability distribution for latent variables, usually a multivariate Gaussian. The original Variational AutoEncoder (VAE) uses a stochastic encoder, which causes problematic distortion and overlap between distinct inputs in latent space. This randomness turned out to be unnecessary: one can instead use a deterministic encoder with an additional regularizer ensuring that the sample distribution in latent space is close to the required one. The original such approach (WAE) uses the Wasserstein metric, which requires comparison with a random sample and an arbitrarily chosen kernel. Later, CWAE derived a non-random analytic formula by averaging the $L_2$ distance of the Gaussian-smoothed sample over all 1D projections. However, these arbitrarily chosen regularizers do not actually lead to a Gaussian distribution. This paper proposes a regularizer that directly optimizes the empirical distribution function of radii and pairwise distances to agree with the CDF implied by the Gaussian (also satisfying other normality tests), in order to directly attract this (or some other chosen) distribution in the latent space of an AutoEncoder.
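The core idea of comparing an empirical distribution function against the Gaussian-implied CDF can be illustrated with a minimal sketch. The function name and the specific Cramér-von-Mises-style penalty below are illustrative assumptions, not the paper's exact formula; it uses the fact that for a standard multivariate Gaussian in d dimensions, squared radii follow a chi-square(d) distribution, so transforming them through that CDF should give approximately uniform order statistics:

```python
import numpy as np
from scipy.stats import chi2

def radius_cdf_regularizer(z):
    """Penalize disagreement between the empirical distribution of squared
    radii of latent points z (shape n x d) and the chi-square(d) law they
    would follow under a standard multivariate Gaussian.

    Illustrative sketch: applies the chi2(d) CDF to sorted squared radii
    (probability integral transform) and measures mean squared deviation
    from uniform order-statistic positions (i - 0.5)/n.
    """
    n, d = z.shape
    r2 = np.sort(np.sum(z ** 2, axis=1))        # sorted squared radii
    u = chi2.cdf(r2, df=d)                      # ~Uniform(0,1) if z is Gaussian
    targets = (np.arange(1, n + 1) - 0.5) / n   # ideal uniform positions
    return np.mean((u - targets) ** 2)          # CvM-style discrepancy
```

In practice such a penalty would be added to the reconstruction loss (in a differentiable framework, using a smooth surrogate for the sort); a Gaussian latent sample yields a penalty near zero, while a shifted or skewed sample yields a visibly larger one.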
