Towards universal neural nets: Gibbs machines and ACE

Abstract

We study a class of neural nets, Gibbs machines, which are variational auto-encoders designed for gradual learning. They offer a universal platform for incrementally adding newly learned features, including physical symmetries in space and time. Combining them with classifiers gives rise to a brand of universal generative neural nets: stochastic auto-classifier-encoders (ACE). ACE preserve the non-Gaussian and clustering nature of real-life data and achieve state-of-the-art performance in both classification and density estimation on the MNIST data set.
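The abstract gives no architectural details, so the following is only a rough NumPy sketch of the general idea it names: a variational auto-encoder whose latent code also feeds a classifier head. All dimensions, layer choices, and weight initializations below are hypothetical illustrations, not the paper's actual ACE model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: input, hidden, latent, number of classes
D, H, Z, K = 784, 128, 16, 10

# Randomly initialized weights for a one-hidden-layer sketch
W_enc = rng.normal(0, 0.05, (D, H))
W_mu = rng.normal(0, 0.05, (H, Z))
W_logvar = rng.normal(0, 0.05, (H, Z))
W_dec = rng.normal(0, 0.05, (Z, D))
W_cls = rng.normal(0, 0.05, (Z, K))

def forward(x):
    """One forward pass: encode, sample a latent code, decode, classify."""
    h = np.tanh(x @ W_enc)
    mu, logvar = h @ W_mu, h @ W_logvar
    # Reparameterization trick: z = mu + sigma * eps
    eps = rng.standard_normal(mu.shape)
    z = mu + np.exp(0.5 * logvar) * eps
    x_hat = 1.0 / (1.0 + np.exp(-(z @ W_dec)))  # reconstruction in (0, 1)
    logits = z @ W_cls                          # classifier head on the latent
    return x_hat, logits, mu, logvar

x = rng.random((4, D))  # a small batch of fake "images"
x_hat, logits, mu, logvar = forward(x)
print(x_hat.shape, logits.shape)
```

A trained model of this shape would minimize a reconstruction term plus a KL term (the usual VAE objective) jointly with a classification loss on the logits; the sketch shows only the forward pass.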
