
Deep Network Approximation for Smooth Functions

SIAM Journal on Mathematical Analysis (SIAM J. Math. Anal.), 2020
Abstract

This paper establishes the (nearly) optimal approximation error characterization of deep rectified linear unit (ReLU) networks for smooth functions in terms of both width and depth simultaneously. To that end, we first prove that multivariate polynomials can be approximated by deep ReLU networks of width $\mathcal{O}(N)$ and depth $\mathcal{O}(L)$ with an approximation error $\mathcal{O}(N^{-L})$. Through local Taylor expansions and their deep ReLU network approximations, we show that deep ReLU networks of width $\mathcal{O}(N\ln N)$ and depth $\mathcal{O}(L\ln L)$ can approximate $f\in C^s([0,1]^d)$ with a nearly optimal approximation error $\mathcal{O}(\|f\|_{C^s([0,1]^d)}\, N^{-2s/d} L^{-2s/d})$. Our estimate is non-asymptotic in the sense that it is valid for arbitrary width and depth specified by $N\in\mathbb{N}^+$ and $L\in\mathbb{N}^+$, respectively.
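The building block behind results of this kind is that ReLU networks can reproduce polynomials efficiently. As a hedged illustration (not the paper's exact construction), the classical Yarotsky-style sketch below approximates $x^2$ on $[0,1]$ by composing a piecewise-linear "hat" function, each hat realized by three ReLU units; the truncation error after $m$ compositions decays like $4^{-(m+1)}$, showing how depth buys exponential accuracy:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Hat function on [0,1] built from three ReLU units:
    # g(x) = 2 ReLU(x) - 4 ReLU(x - 1/2) + 2 ReLU(x - 1)
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def approx_square(x, m):
    # Yarotsky-style approximation of x^2 on [0,1]:
    #   x^2 ~ x - sum_{k=1}^m g^{(k)}(x) / 4^k,
    # where g^{(k)} is the k-fold composition of the hat function.
    # Each extra composition adds network depth and quarters the error.
    g = np.array(x, dtype=float)
    out = np.array(x, dtype=float)
    for k in range(1, m + 1):
        g = hat(g)
        out = out - g / 4**k
    return out

x = np.linspace(0.0, 1.0, 10001)
for m in (2, 4, 6):
    err = np.max(np.abs(approx_square(x, m) - x**2))
    print(m, err)  # uniform error is bounded by 4^{-(m+1)}
```

This sketch only covers the monomial $x^2$; the paper's results combine such polynomial approximations with local Taylor expansions to treat general $f\in C^s([0,1]^d)$.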
