Approximating smooth functions by deep neural networks with sigmoid activation function

Abstract

We study the power of deep neural networks (DNNs) with sigmoid activation function. Recently, it was shown that DNNs approximate any d-dimensional, smooth function on a compact set with a rate of order W^{-p/d}, where W is the number of nonzero weights in the network and p is the smoothness of the function. Unfortunately, these rates hold only for a special class of sparsely connected DNNs. We ask whether the same approximation rate can be shown for a simpler and more general class, namely DNNs that are defined only by their width and depth. In this article we show that DNNs with fixed depth and a width of order M^d achieve an approximation rate of M^{-2p}. As a conclusion, we quantitatively characterize the approximation power of DNNs in terms of the overall number of weights W_0 in the network and show an approximation rate of W_0^{-p/d}. This more general result finally helps us to understand which network topology guarantees a given target accuracy.
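The two rates stated in the abstract are consistent with a simple weight count: for a network of fixed depth and width of order M^d, the number of weights W_0 is dominated by the square hidden-to-hidden weight matrices and so scales like M^{2d}, which makes M^{-2p} the same as W_0^{-p/d}. A minimal numerical sanity check of this arithmetic (the particular values of d, p, and M below are illustrative, not from the paper):

```python
# Sanity check: with width ~ M^d and fixed depth, W_0 ~ (M^d)^2 = M^{2d},
# so the rate M^{-2p} coincides with W_0^{-p/d}.
d, p, M = 3, 2, 10

width = M ** d          # width of each hidden layer
W0 = width ** 2         # leading-order weight count (one hidden-to-hidden matrix)

rate_in_M = M ** (-2 * p)
rate_in_W0 = W0 ** (-p / d)

print(rate_in_M, rate_in_W0)  # both equal 10^{-4} up to floating-point error
```

The check confirms that expressing the approximation error in terms of the total weight count W_0 recovers the familiar W^{-p/d}-type rate known for sparsely connected networks.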
