
Simulating Posterior Bayesian Neural Networks with Dependent Weights

Main: 21 pages
2 figures
Bibliography: 3 pages
Abstract

In this paper we consider posterior Bayesian fully connected, feedforward deep neural networks with dependent weights. In particular, when the likelihood is Gaussian, we identify the distribution of the large-width limit and provide an algorithm to sample from the network. In the shallow case we compute the distribution of the output explicitly, proving that it is a Gaussian mixture. All theoretical results are validated numerically.
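Since the shallow-case output distribution is shown to be a Gaussian mixture, sampling it reduces to a standard two-step procedure: draw a mixture component, then draw from that component's Gaussian. The sketch below illustrates this generic scheme; the mixture weights, means, and standard deviations here are placeholder values, not the ones derived in the paper (which depend on the network and the Gaussian likelihood).

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder mixture parameters for illustration only; the paper's actual
# components are determined by the network architecture and the likelihood.
weights = np.array([0.3, 0.7])   # mixing probabilities (sum to 1)
means = np.array([-1.0, 2.0])    # component means
stds = np.array([0.5, 1.0])      # component standard deviations


def sample_gaussian_mixture(n):
    """Draw n samples: pick a component index, then draw from its Gaussian."""
    comps = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[comps], stds[comps])


samples = sample_gaussian_mixture(10_000)
```

The empirical mean of the draws should approach the mixture mean, here 0.3 * (-1.0) + 0.7 * 2.0 = 1.1.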
