
Neural network integral representations with the ReLU activation function

Mathematical and Scientific Machine Learning (MSML), 2019
Abstract

We derive a formula for neural network integral representations on the sphere with the ReLU activation function, under the assumption that the outer weights have finite L_1 norm with respect to the Lebesgue measure on the sphere. In the one-dimensional case, we further characterize all such representations via a closed-form formula. Moreover, in this case our formula allows one to explicitly find the neural network representation of least L_1 norm for a given function.
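As a rough numerical illustration of the idea (not the paper's exact formula), the following sketch uses the one-dimensional fact from Taylor's theorem with integral remainder that a smooth f on [0, 1] satisfies f(x) = f(0) + f'(0) x + ∫₀¹ f''(b) ReLU(x − b) db, so f'' plays the role of an outer-weight density in a ReLU integral representation; discretizing the integral yields a finite ReLU network. The function names and discretization choices here are illustrative assumptions.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Illustrative 1-D sketch (assumed setup, not the paper's construction):
# Taylor's theorem with integral remainder gives, for smooth f on [0, 1],
#   f(x) = f(0) + f'(0) * x + int_0^1 f''(b) ReLU(x - b) db,
# so f''(b) acts as the density of outer weights over ReLU neurons with bias b.
def relu_integral_approx(f, df, d2f, x, n_neurons=1000):
    b = np.linspace(0.0, 1.0, n_neurons)   # biases = ReLU breakpoints in [0, 1]
    w = d2f(b) / n_neurons                  # Riemann weights, approx f''(b) db
    # Finite ReLU network approximating the integral representation
    return f(0.0) + df(0.0) * x + relu(x[:, None] - b[None, :]) @ w

x = np.linspace(0.0, 1.0, 50)
approx = relu_integral_approx(np.sin, np.cos, lambda b: -np.sin(b), x)
max_err = np.max(np.abs(approx - np.sin(x)))  # shrinks as n_neurons grows
```

Here the discretized outer weights sum to roughly ∫ |f''(b)| db, which is the finite L_1-norm quantity the abstract's assumption controls.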
