Approximation by Combinations of ReLU and Squared ReLU Ridge Functions with ℓ¹ and ℓ⁰ Controls

26 July 2016
Jason M. Klusowski, Andrew R. Barron
arXiv:1607.07819
Abstract

We establish L∞ and L² error bounds for functions of many variables that are approximated by linear combinations of ReLU (rectified linear unit) and squared ReLU ridge functions with ℓ¹ and ℓ⁰ controls on their inner and outer parameters. With the squared ReLU ridge function, we show that the L² approximation error is inversely proportional to the inner layer ℓ⁰ sparsity and it need only be sublinear in the outer layer ℓ⁰ sparsity. Our constructions are obtained using a variant of the Jones–Barron probabilistic method, which can be interpreted as either stratified sampling with proportionate allocation or two-stage cluster sampling. We also provide companion error lower bounds that reveal near optimality of our constructions. Despite the sparsity assumptions, we showcase the richness and flexibility of these ridge combinations by defining a large family of functions, in terms of certain spectral conditions, that are particularly well approximated by them.
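
To make the object of study concrete, here is a minimal NumPy sketch (not taken from the paper) of the approximants the abstract describes: linear combinations f_m(x) = Σ_k c_k φ(a_k·x − t_k), where φ is either the ReLU or the squared ReLU, the ℓ¹ control refers to norms of the inner weights a_k and outer weights c_k, and the ℓ⁰ control counts how many of those parameters are nonzero. The parameter choices, names, and random values below are illustrative assumptions, not the authors' construction or their error bounds.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def squared_relu(z):
    return np.maximum(z, 0.0) ** 2

def ridge_combination(x, inner_weights, thresholds, outer_weights, activation=relu):
    """Evaluate f_m(x) = sum_k c_k * activation(a_k . x - t_k).

    x: (n, d) input points; inner_weights: (m, d); thresholds: (m,); outer_weights: (m,).
    """
    pre = x @ inner_weights.T - thresholds      # (n, m): ridge arguments a_k . x - t_k
    return activation(pre) @ outer_weights      # (n,): linear combination with outer weights c_k

# Illustrative parameters (assumed for this sketch):
rng = np.random.default_rng(0)
d, m = 10, 50                                   # input dimension, number of ridge units
a = rng.standard_normal((m, d))
a /= np.abs(a).sum(axis=1, keepdims=True)       # normalize each inner weight vector to unit l1 norm
t = rng.uniform(-1.0, 1.0, size=m)              # thresholds
c = rng.standard_normal(m) / m                  # outer weights

x = rng.uniform(-1.0, 1.0, size=(5, d))
f_relu = ridge_combination(x, a, t, c, relu)
f_sq = ridge_combination(x, a, t, c, squared_relu)

# Sparsity / variation quantities of the kind the paper's controls refer to:
ell1_outer = np.abs(c).sum()                    # l1 norm of the outer weights
ell0_outer = np.count_nonzero(c)                # number of active ridge units (outer l0 sparsity)
ell0_inner = np.count_nonzero(a)                # number of nonzero inner parameters (inner l0 sparsity)
print(f_relu, f_sq, ell1_outer, ell0_outer, ell0_inner)
```

In this notation, the paper's results bound how well such an f_m (with the ℓ¹ and ℓ⁰ quantities above constrained) can approximate a target function in L∞ and L² norms.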
