
Deep Neural Networks with General Activations: Super-Convergence in Sobolev Norms

Main: 40 pages · Bibliography: 5 pages · 5 figures · 1 table
Abstract

This paper establishes a comprehensive approximation result for deep fully connected neural networks with general, commonly used activation functions on the Sobolev spaces $W^{n,\infty}$, with errors measured in the $W^{m,p}$-norm for $m < n$ and $1 \le p \le \infty$. The derived rates surpass those of classical numerical approximation techniques, such as finite element and spectral methods, a phenomenon we refer to as \emph{super-convergence}. Our analysis shows that deep networks with general activations can approximate weak solutions of partial differential equations (PDEs) with accuracy superior to that of traditional numerical methods at the approximation level. Furthermore, this work closes a significant gap in the error-estimation theory for neural-network-based approaches to PDEs, offering a unified theoretical foundation for their use in scientific computing.
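To make the super-convergence claim concrete, the display below contrasts the classical rate with the doubled exponent established in prior work on deep ReLU networks. This is an illustrative sketch of the expected form of the bound only: the exponent $2(n-m)/d$, the width/depth parametrization, and the constants are assumptions carried over from known ReLU results, not the paper's exact statement for general activations. For a target $f \in W^{n,\infty}([0,1]^d)$:

\[
\underbrace{\|f - f_N\|_{W^{m,p}} \;\le\; C\, N^{-(n-m)/d}}_{\text{classical method with } N \text{ degrees of freedom}}
\qquad \text{vs.} \qquad
\underbrace{\|f - \phi_{W,L}\|_{W^{m,p}} \;\le\; C\, (WL)^{-2(n-m)/d}}_{\text{network of width } W \text{ and depth } L}
\]

Here $\phi_{W,L}$ denotes the network realization (a hypothetical notation for this sketch). The doubling of the exponent relative to the parameter count $WL$, as compared with the classical exponent in $N$, is what "super-convergence" refers to.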
