Neural network integral representations with the ReLU activation
function
Mathematical and Scientific Machine Learning (MSML), 2019
Abstract
We derive a formula for the integral representation of a shallow neural network with the ReLU activation function under a finite-norm assumption on the outer weights with respect to the Lebesgue measure on the sphere. For univariate target functions, we further provide a closed-form formula for all possible representations. In this case, our formula also allows one to explicitly solve for the least-norm neural network representation of a given function.
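To make the setting concrete, the following is a minimal sketch (not the paper's formula) of how a shallow ReLU network can be viewed as a Monte Carlo discretization of an integral representation f(x) = ∫ a(w, b) ReLU(⟨w, x⟩ + b) dμ(w, b), with inner weights w sampled uniformly from the unit sphere; all names and the sampling scheme here are illustrative assumptions.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

rng = np.random.default_rng(0)

# Illustrative discretization: average n ReLU units whose inner weights
# lie on the unit sphere S^{d-1}, approximating the integral over (w, b).
d, n = 3, 5000                                   # input dimension, units
w = rng.normal(size=(n, d))
w /= np.linalg.norm(w, axis=1, keepdims=True)    # uniform on the sphere
b = rng.uniform(-1.0, 1.0, size=n)               # biases
a = rng.normal(size=n)                           # outer weights a(w, b)

def shallow_net(x):
    # (1/n) * sum_i a_i * ReLU(<w_i, x> + b_i) -- empirical average
    # standing in for the integral against the measure on the sphere.
    return float(np.mean(a * relu(w @ x + b)))

x = np.ones(d) / np.sqrt(d)
y = shallow_net(x)
print(y)
```

As n grows, the empirical average converges to the integral it discretizes, which is the sense in which a wide shallow network realizes an integral representation.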
