
Nonclosedness of Sets of Neural Networks in Sobolev Spaces

Abstract

We examine the closedness of sets of realized neural networks of a fixed architecture in Sobolev spaces. For an exactly $m$-times differentiable activation function $\rho$, we construct a sequence of neural networks $(\Phi_n)_{n \in \mathbb{N}}$ whose realizations converge in order-$(m-1)$ Sobolev norm to a function that cannot be realized exactly by a neural network. Thus, sets of realized neural networks are not closed in the order-$(m-1)$ Sobolev spaces $W^{m-1,p}$ for $p \in [1,\infty]$. We further show that these sets are not closed in $W^{m,p}$ under slightly stronger conditions on the $m$-th derivative of $\rho$. For a real analytic activation function, we show that sets of realized neural networks are not closed in $W^{k,p}$ for any $k \in \mathbb{N}$. The nonclosedness allows for approximation of non-network target functions with unbounded parameter growth. We partially characterize the rate of parameter growth for most activation functions by showing that a specific sequence of realized neural networks can approximate the activation function's derivative with weights increasing inversely proportional to the $L^p$ approximation error. Finally, we present experimental results showing that networks are capable of closely approximating non-network target functions with increasing parameters via training.
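To make the parameter-growth phenomenon concrete, the following is a minimal numerical sketch of one natural candidate sequence: two-neuron networks realizing the difference quotients $x \mapsto n(\rho(x + 1/n) - \rho(x))$, which converge to the derivative $\rho'$ while the outer weight $n$ grows inversely with the approximation error. The specific difference-quotient construction and the choice $\rho = \tanh$ are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Illustrative sketch (assumption, not the paper's construction):
# a one-hidden-layer network with two neurons realizing
#   Phi_n(x) = n * (rho(x + 1/n) - rho(x)),
# a difference quotient that converges to rho' as n grows,
# while the outer weight n diverges.
rho = np.tanh                               # smooth activation (assumption)
drho = lambda x: 1.0 / np.cosh(x) ** 2      # exact derivative of tanh

x = np.linspace(-3.0, 3.0, 10_001)          # evaluation grid

for n in [1, 10, 100, 1000]:
    # Realization of Phi_n: shifts enter as biases, n as the outer weight.
    phi_n = n * (rho(x + 1.0 / n) - rho(x))
    # Discrete sup-norm error against the non-realizable target rho'.
    err = np.max(np.abs(phi_n - drho(x)))
    print(f"n = {n:5d}   sup-norm error ~ {err:.2e}   outer weight = {n}")
```

Since $\rho''$ is bounded here, the printed error shrinks roughly like $1/n$ while the outer weight equals $n$, illustrating the inverse proportionality between weight growth and approximation error stated in the abstract.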
