Do ReLU Networks Have An Edge When Approximating Compactly-Supported Functions?

24 April 2022
Anastasis Kratsios
Behnoosh Zamanlooy
Abstract

We study the problem of approximating compactly-supported integrable functions while implementing their support set using feedforward neural networks. Our first main result transcribes this "structured" approximation problem into a universality problem. We do this by constructing a refinement of the usual topology on the space $L^1_{\operatorname{loc}}(\mathbb{R}^d,\mathbb{R}^D)$ of locally-integrable functions in which compactly-supported functions can only be approximated in $L^1$-norm by functions with matching discretized support. We establish the universality of ReLU feedforward networks with bilinear pooling layers in this refined topology. Consequently, we find that ReLU feedforward networks with bilinear pooling can approximate compactly-supported functions while implementing their discretized support. We derive a quantitative uniform version of our universal approximation theorem on the dense subclass of compactly-supported Lipschitz functions. This quantitative result expresses the depth, width, and number of bilinear pooling layers required to construct this ReLU network in terms of the target function's regularity, the metric capacity and diameter of its essential support, and the dimensions of the input and output spaces. Conversely, we show that polynomial regressors and analytic feedforward networks are not universal in this space.
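For intuition only, below is a minimal sketch of the kind of architecture the abstract refers to: a ReLU feedforward network whose hidden features pass through an outer-product (bilinear) pooling step before a linear readout. The layer widths, the choice to pool the hidden vector with itself, and all parameter values are illustrative assumptions; the paper's precise definition of a bilinear pooling layer and its construction of the approximant may differ.

    import numpy as np

    def relu(x):
        # Elementwise ReLU activation.
        return np.maximum(x, 0.0)

    def feedforward_relu(x, weights, biases):
        # Plain ReLU feedforward pass through a list of (W, b) layers.
        h = x
        for W, b in zip(weights, biases):
            h = relu(W @ h + b)
        return h

    def bilinear_pool(u, v):
        # Outer-product (bilinear) pooling of two feature vectors, flattened.
        return np.outer(u, v).ravel()

    # Toy dimensions (assumed for illustration): input in R^3, output in R^2,
    # two hidden ReLU layers of width 8, one bilinear pooling step, linear readout.
    rng = np.random.default_rng(0)
    Ws = [rng.standard_normal((8, 3)), rng.standard_normal((8, 8))]
    bs = [rng.standard_normal(8), rng.standard_normal(8)]
    W_out = rng.standard_normal((2, 64))  # 64 = 8 * 8 pooled features

    x = rng.standard_normal(3)
    h = feedforward_relu(x, Ws, bs)
    y = W_out @ bilinear_pool(h, h)       # network output in R^D with D = 2
    print(y)

The bilinear pooling step makes the readout quadratic in the hidden features, which is the kind of non-analytic-layer composition the abstract contrasts with polynomial regressors and analytic feedforward networks.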
