
Error bounds for deep ReLU networks using the Kolmogorov--Arnold superposition theorem

Neural Networks (NN), 2019
Abstract

We prove a theorem concerning the approximation of multivariate continuous functions by deep ReLU networks, for which the curse of dimensionality is lessened. Our theorem is based on a constructive proof of the Kolmogorov--Arnold superposition theorem, and on a subset of multivariate continuous functions whose outer superposition functions can be efficiently approximated by deep ReLU networks.
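
For reference, the classical Kolmogorov--Arnold superposition theorem on which the abstract builds states that every continuous function f : [0,1]^n -> R admits a representation

f(x_1, \dots, x_n) = \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right),

where the inner functions \phi_{q,p} and the outer functions \Phi_q are continuous univariate functions; the "outer superposition functions" mentioned in the abstract are the \Phi_q. (This is the standard statement of the theorem, not the paper's specific construction.)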
