Weighted variation spaces and approximation by shallow ReLU networks

28 July 2023
Ronald A. DeVore
Robert D. Nowak
Rahul Parhi
Jonathan W. Siegel
arXiv:2307.15772
Abstract

We investigate the approximation of functions $f$ on a bounded domain $\Omega \subset \mathbb{R}^d$ by the outputs of single-hidden-layer ReLU neural networks of width $n$. This form of nonlinear $n$-term dictionary approximation has been intensely studied since it is the simplest case of neural network approximation (NNA). There are several celebrated approximation results for this form of NNA that introduce novel model classes of functions on $\Omega$ whose approximation rates avoid the curse of dimensionality. These novel classes include Barron classes, and classes based on sparsity or variation such as the Radon-domain BV classes. The present paper is concerned with the definition of these novel model classes on domains $\Omega$. The current definition of these model classes does not depend on the domain $\Omega$. A new and more appropriate definition of model classes on domains is given by introducing the concept of weighted variation spaces. These new model classes are intrinsic to the domain itself. The importance of these new model classes is that they are strictly larger than the classical (domain-independent) classes. Yet, it is shown that they maintain the same NNA rates.
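For orientation, here is a minimal sketch, in standard notation, of the objects the abstract refers to: the form of a width-$n$ single-hidden-layer ReLU network and the usual (unweighted) dictionary-variation norm. This is standard background rather than the paper's own definitions; in particular, the closing gloss on the weighted spaces is an assumption based on the abstract.

```latex
% Width-n single-hidden-layer ReLU network on \Omega \subset \mathbb{R}^d:
\[
  f_n(x) = \sum_{j=1}^{n} a_j\,\sigma(w_j \cdot x + b_j),
  \qquad \sigma(t) := \max\{t, 0\},
\]
% with inner weights w_j \in \mathbb{R}^d, biases b_j \in \mathbb{R},
% and outer coefficients a_j \in \mathbb{R}.

% For a dictionary \mathbb{D} of such ReLU atoms, the classical
% (domain-independent) variation norm is typically taken to be
\[
  \|f\|_{\mathcal{V}(\mathbb{D})}
    = \inf\Big\{ \sum_j |a_j| \;:\;
        f = \sum_j a_j g_j, \ g_j \in \mathbb{D} \Big\},
\]
% with the infimum extended to limits of such sums; the variation
% space is the set of f with finite norm. The paper's weighted
% variation spaces modify the plain coefficient sum with weights
% reflecting how each atom interacts with the domain \Omega
% (assumed gloss, not the paper's exact definition).
```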
