
Universal flow approximation with deep residual networks

Abstract

Residual networks (ResNets) are a deep learning architecture with the recursive structure \[x_{k+1} = x_k + R_k(x_k),\] where \(R_k\) is a neural network and the copying of the input \(x_k\) is called a skip connection. This structure can be seen as the explicit Euler discretisation of an associated ordinary differential equation. We use this interpretation to show that, by simultaneously increasing the number of skip connections and the expressivity of the networks \(R_k\), the flow of an arbitrary right-hand side \[f\in L^1\left( I; \mathcal C_b^{0, 1}(\mathbb R^d; \mathbb R^d)\right)\] can be approximated uniformly by deep ReLU ResNets on compact sets. Further, we derive estimates on the number of parameters needed to achieve a prescribed accuracy under temporal regularity assumptions. Finally, we discuss the possibility of using ResNets for diffeomorphic matching problems and propose next steps in the theoretical foundation of this approach.
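The Euler-discretisation viewpoint above can be illustrated with a minimal sketch (not the paper's construction): each residual block is taken to be \(R_k(x) = h\, f(x)\) for a hypothetical right-hand side \(f(x) = -x\), whose exact flow is \(x(t) = x_0 e^{-t}\). Increasing the number of blocks (skip connections) improves the approximation of the flow.

```python
import math

def f(x):
    # Hypothetical right-hand side; its exact flow map is x(t) = x0 * exp(-t).
    return -x

def resnet_euler_flow(x0, T, N):
    """Compose N residual blocks x <- x + R_k(x) with R_k(x) = h * f(x),
    i.e. N explicit Euler steps of size h = T / N for x' = f(x)."""
    h = T / N
    x = x0
    for _ in range(N):
        x = x + h * f(x)  # one residual block / Euler step
    return x

x0, T = 1.0, 1.0
exact = x0 * math.exp(-T)
coarse = resnet_euler_flow(x0, T, 10)    # 10 blocks
fine = resnet_euler_flow(x0, T, 1000)    # 1000 blocks
# The approximation error shrinks as the number of blocks grows.
print(abs(coarse - exact) > abs(fine - exact))  # True
```

In the paper's setting the blocks \(R_k\) are ReLU networks rather than exact evaluations of \(f\), so the block count and the per-block expressivity must grow together.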
