Wide stable neural networks: Sample regularity, functional convergence and Bayesian inverse problems

Tomás Soto
Abstract

We study the large-width asymptotics of random fully connected neural networks with weights drawn from $\alpha$-stable distributions, a family of heavy-tailed distributions arising as the limiting distributions in the Gnedenko-Kolmogorov heavy-tailed central limit theorem. We show that in an arbitrary bounded Euclidean domain $\mathcal{U}$ with smooth boundary, the random field at the infinite-width limit, characterized in previous literature in terms of finite-dimensional distributions, has sample functions in the fractional Sobolev-Slobodeckij-type quasi-Banach function space $W^{s,p}(\mathcal{U})$ for integrability indices $p < \alpha$ and suitable smoothness indices $s$ depending on the activation function of the neural network, and we establish the functional convergence of the processes in $\mathcal{P}(W^{s,p}(\mathcal{U}))$. This convergence result is leveraged in the study of functional posteriors for edge-preserving Bayesian inverse problems with stable neural network priors.
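For intuition only, the following is a minimal Python sketch (not taken from the paper) of sampling one realization of a fully connected network with i.i.d. symmetric $\alpha$-stable weights and biases; the $1/n^{1/\alpha}$ width scaling, the tanh activation, and all names and parameters are illustrative assumptions rather than the paper's exact construction.

```python
import numpy as np
from scipy.stats import levy_stable


def sample_stable_network(x, widths, alpha=1.5, activation=np.tanh, seed=0):
    """Sample one realization of a fully connected network whose weights
    and biases are i.i.d. symmetric alpha-stable (beta = 0).

    x      : (n_points, d_in) array of input locations
    widths : list of hidden-layer widths; the 1/n^{1/alpha} scaling below is
             the heavy-tailed analogue of the usual 1/sqrt(n) Gaussian scaling
             (an assumption for illustration, not the paper's statement).
    """
    rng = np.random.default_rng(seed)
    h = np.asarray(x, dtype=float)
    fan_in = h.shape[1]
    for n in widths + [1]:  # hidden layers, then a scalar output layer
        W = levy_stable.rvs(alpha, 0.0, size=(fan_in, n), random_state=rng)
        b = levy_stable.rvs(alpha, 0.0, size=n, random_state=rng)
        pre = h @ W / fan_in ** (1.0 / alpha) + b  # stable scaling of the sum
        h = activation(pre) if n > 1 else pre      # no activation on the output
        fan_in = n
    return h[:, 0]


# Example: evaluate one wide-network sample path on a grid in [0, 1].
grid = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
path = sample_stable_network(grid, widths=[2000], alpha=1.5)
```

As the hidden width grows, repeated draws of `path` illustrate the kind of heavy-tailed random fields whose infinite-width limits the paper studies.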
