Deep Network Approximation: Beyond ReLU to Diverse Activation Functions
Journal of machine learning research (JMLR), 2023
Abstract
This paper explores the expressive power of deep neural networks for a diverse range of activation functions. An activation function set is defined to encompass the majority of commonly used activation functions, such as , , , , , , , , , , , , , , , and . We demonstrate that for any activation function , a network of width and depth can be approximated to arbitrary precision by a -activated network of width and depth on any bounded set. This finding enables the extension of most approximation results achieved with networks to a wide variety of other activation functions, at the cost of slightly larger constants.
View on arXivComments on this paper
