Student-t processes as infinite-width limits of posterior Bayesian neural networks

Abstract

The asymptotic properties of Bayesian Neural Networks (BNNs) have been extensively studied, particularly regarding their approximations by Gaussian processes in the infinite-width limit. We extend these results by showing that posterior BNNs can be approximated by Student-t processes, which offer greater flexibility in modeling uncertainty. Specifically, we show that, if the parameters of a BNN follow a Gaussian prior distribution, and the variance of both the last hidden layer and the Gaussian likelihood function follows an Inverse-Gamma prior distribution, then the resulting posterior BNN converges to a Student-t process in the infinite-width limit. Our proof leverages the Wasserstein metric to establish control over the convergence rate of the Student-t process approximation.
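The key mechanism behind the abstract's claim is a classical fact: a Gaussian whose variance is mixed over an Inverse-Gamma distribution has Student-t marginals. The following is a minimal illustrative sketch (not the paper's construction, and all parameter values are hypothetical) that checks this numerically: with σ² ~ InvGamma(a, b) and X = σZ for standard normal Z, the rescaled X/√(b/a) follows a Student-t distribution with 2a degrees of freedom.

```python
import numpy as np
from scipy import stats

# Illustrative check: an Inverse-Gamma scale mixture of Gaussians is Student-t.
# Shape a and scale b are arbitrary illustrative values, not from the paper.
rng = np.random.default_rng(0)
a, b = 3.0, 2.0
n = 200_000

# sigma^2 ~ InvGamma(a, scale=b), then X = sigma * Z with Z standard normal.
sigma2 = stats.invgamma.rvs(a, scale=b, size=n, random_state=rng)
x = rng.standard_normal(n) * np.sqrt(sigma2)

# Marginally, X has variance b / (a - 1), and X / sqrt(b / a) should be
# Student-t with 2a degrees of freedom; a KS test should not reject.
empirical_var = np.var(x)
ks = stats.kstest(x / np.sqrt(b / a), stats.t(df=2 * a).cdf)
print(empirical_var, ks.pvalue)
```

In the paper's setting the same mixing structure appears at the level of the whole process: conditionally on the variance, the infinite-width network converges to a Gaussian process, and integrating out the Inverse-Gamma variance yields a Student-t process.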

@article{caporali2025_2502.04247,
  title={Student-t processes as infinite-width limits of posterior Bayesian neural networks},
  author={Francesco Caporali and Stefano Favaro and Dario Trevisan},
  journal={arXiv preprint arXiv:2502.04247},
  year={2025}
}