Equivalence of approximation by convolutional neural networks and fully-connected networks
Abstract
Convolutional neural networks are the most widely used type of neural network in applications. In mathematical analysis, however, mostly fully-connected networks are studied. In this paper, we establish a connection between both network architectures. Using this connection, we show that all upper and lower bounds concerning approximation rates of \emph{fully-connected} neural networks for functions $f \in \mathcal{C}$, for an arbitrary function class $\mathcal{C}$, translate to essentially the same bounds on approximation rates of \emph{convolutional} neural networks for functions $f \in \mathcal{C}^{\mathrm{equi}}$, where the class $\mathcal{C}^{\mathrm{equi}}$ consists of all translation equivariant functions whose first coordinate belongs to $\mathcal{C}$.
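
As a concrete illustration of the structural link behind such a result, the following NumPy sketch (not taken from the paper; it assumes a toy one-dimensional setting with a single channel, periodic convolution, and cyclic shifts as the notion of translation, which may differ from the network model used in the paper) checks two standard facts: a circular convolutional layer equals a fully-connected layer whose weight matrix is circulant, and such a layer commutes with translations. The names circular_conv and conv_as_fully_connected are illustrative choices, not notation from the paper.

```python
import numpy as np

def circular_conv(x, kernel):
    """Periodic (circular) 1D convolution: out[i] = sum_j kernel[j] * x[(i - j) mod n]."""
    n = len(x)
    out = np.zeros(n)
    for i in range(n):
        for j, w in enumerate(kernel):
            out[i] += w * x[(i - j) % n]
    return out

def conv_as_fully_connected(kernel, n):
    """The same convolution written as a dense (fully-connected) weight matrix.

    The matrix is circulant: entry (i, j) is the zero-padded kernel evaluated at
    (i - j) mod n, so the convolutional layer is a fully-connected layer with
    shared (tied) weights.
    """
    k = np.zeros(n)
    k[: len(kernel)] = kernel
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            W[i, j] = k[(i - j) % n]
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 8
    x = rng.standard_normal(n)
    kernel = rng.standard_normal(3)

    # 1) A convolutional layer is a structured fully-connected layer.
    W = conv_as_fully_connected(kernel, n)
    assert np.allclose(W @ x, circular_conv(x, kernel))

    # 2) Translation equivariance: convolving a shifted input equals
    #    shifting the convolved output (here, translation = cyclic shift).
    t = 3
    assert np.allclose(
        circular_conv(np.roll(x, t), kernel),
        np.roll(circular_conv(x, kernel), t),
    )

    print("circular convolution = circulant fully-connected layer, and it commutes with shifts")
```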
