
Exponential Expressivity of ReLU$^k$ Neural Networks on Gevrey Classes with Point Singularities

Abstract

We analyze deep neural network emulation rates of smooth functions with point singularities in bounded, polytopal domains $\mathrm{D} \subset \mathbb{R}^d$, $d=2,3$. We prove exponential emulation rates in Sobolev spaces in terms of the number of neurons and in terms of the number of nonzero coefficients for Gevrey-regular solution classes defined in terms of weighted Sobolev scales in $\mathrm{D}$, comprising the countably normed spaces of I.M. Babu\v{s}ka and B.Q. Guo. As an intermediate result, we prove that continuous, piecewise polynomial, high-order (``$p$-version'') finite elements with elementwise polynomial degree $p\in\mathbb{N}$ on arbitrary, regular, simplicial partitions of polyhedral domains $\mathrm{D} \subset \mathbb{R}^d$, $d\geq 2$, can be exactly emulated by neural networks combining ReLU and ReLU$^2$ activations. On shape-regular, simplicial partitions of polytopal domains $\mathrm{D}$, both the number of neurons and the number of nonzero parameters are proportional to the number of degrees of freedom of the finite element space, in particular for the $hp$-Finite Element Method of I.M. Babu\v{s}ka and B.Q. Guo.
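A minimal sketch of one standard ingredient behind such exact-emulation results (not necessarily the construction used in this paper): with the ReLU$^2$ activation $\sigma_2(t) := \max\{0,t\}^2$, squares, and hence products, of network inputs are realized exactly by a fixed number of neurons,
$$
t^2 = \sigma_2(t) + \sigma_2(-t),
\qquad
xy = \tfrac{1}{4}\bigl[(x+y)^2 - (x-y)^2\bigr].
$$
Iterating the product identity reproduces elementwise polynomials of degree $p$ without approximation error, while ReLU units realize the continuous, piecewise linear structure across the simplicial partition.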
