Superiority of GNN over NN in generalizing bandlimited functions

Information and Inference: A Journal of the IMA, 2022
Abstract

We constructively show, via rigorous mathematical arguments, that GNN architectures outperform those of NN in approximating bandlimited functions on compact $d$-dimensional Euclidean grids. We show that the former only needs $\mathcal{M}$ sampled functional values to achieve a uniform approximation error of $O_d(\exp(-c\mathcal{M}^{1/d}))$, and that this error rate is optimal in the sense that NNs might achieve a worse one. On the theoretical side, our work demonstrates that ideas from sampling theory can be effectively used in analyzing the expressive capability of neural networks.
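To give a feel for the stated rate, the following minimal sketch evaluates the bound $O_d(\exp(-c\mathcal{M}^{1/d}))$ numerically; the constant `c` and the sample sizes are hypothetical choices for illustration only, not values from the paper.

```python
import math

def error_bound(M, d, c=1.0):
    """Evaluate the rate exp(-c * M**(1/d)) for M samples in dimension d.

    c is a hypothetical constant; the paper's constant depends on d and
    the bandlimit, and is not specified here.
    """
    return math.exp(-c * M ** (1.0 / d))

# The bound decays as the sample count M grows, but more slowly
# in higher dimension d (the usual curse-of-dimensionality effect).
for d in (1, 2, 4):
    print(d, [error_bound(M, d) for M in (10, 100, 1000)])
```

Note that for fixed $d$ the error decays faster than any polynomial in $\mathcal{M}$, which is what distinguishes this rate from the algebraic rates typical of classical NN approximation bounds.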
