Expressivity and Approximation Properties of Deep Neural Networks with ReLU$^k$ Activation

Abstract

In this paper, we investigate the expressivity and approximation properties of deep neural networks that employ the ReLU$^k$ activation function for $k \geq 2$. Although deep ReLU networks can approximate polynomials effectively, deep ReLU$^k$ networks can represent higher-degree polynomials exactly. Our first contribution is a comprehensive, constructive proof of polynomial representation by deep ReLU$^k$ networks, which allows us to establish an upper bound on both the size and the number of network parameters. Consequently, we demonstrate a suboptimal approximation rate for functions from Sobolev spaces as well as for analytic functions. Additionally, by studying the power of deep ReLU$^k$ networks to represent shallow networks, we show that deep ReLU$^k$ networks can approximate functions from a range of variation spaces, extending beyond those generated solely by the ReLU$^k$ activation function. This finding underscores the adaptability of deep ReLU$^k$ networks in approximating functions within various variation spaces.
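To make the exact-representation claim concrete, here is a standard identity (an illustration, not the paper's specific construction): writing $\sigma_k(x) = \max(0, x)^k$ for the ReLU$^k$ activation, a single hidden layer with two neurons reproduces the monomial $x^k$ exactly,

$$
x^k = \sigma_k(x) + (-1)^k \, \sigma_k(-x),
$$

since for $x \geq 0$ the first term equals $x^k$ and the second vanishes, and the roles reverse for $x < 0$. Combined with the polarization identity $xy = \tfrac{1}{2}\big((x+y)^2 - x^2 - y^2\big)$ for forming products, such blocks can be composed in depth to represent higher-degree polynomials exactly.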
