Approximation power of random neural networks

18 June 2019
Bolton Bailey
Ziwei Ji
Matus Telgarsky
Ruicheng Xian
arXiv: 1906.07709
Abstract

This paper investigates the approximation power of three types of random neural networks: (a) infinite width networks, with weights following an arbitrary distribution; (b) finite width networks obtained by subsampling the preceding infinite width networks; (c) finite width networks obtained by starting with standard Gaussian initialization, and then adding a vanishingly small correction to the weights. The primary result is a fully quantified bound on the rate of approximation of general continuous functions: in all three cases, a function $f$ can be approximated with complexity $\|f\|_1 (d/\delta)^{\mathcal{O}(d)}$, where $\delta$ depends on continuity properties of $f$ and the complexity measure depends on the weight magnitudes and/or cardinalities. Along the way, a variety of ancillary results are developed: an exact construction of Gaussian densities with infinite width networks, an elementary stand-alone proof scheme for approximation via convolutions of radial basis functions, subsampling rates for infinite width networks, and depth separation for corrected networks.
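The following is a minimal sketch, not taken from the paper, of the general setting the abstract describes: a finite width network whose hidden-layer weights are drawn from a standard Gaussian and then kept fixed, with only the output weights fitted, so that the approximation power comes entirely from the random features. The target function, width, and sample sizes here are illustrative choices.

```python
# Hypothetical illustration of approximating a continuous function with
# fixed random (Gaussian) hidden weights and a fitted output layer.
import numpy as np

rng = np.random.default_rng(0)

# Target: a smooth continuous function on [-1, 1] (example choice).
f = lambda x: np.sin(3 * x) + 0.5 * np.cos(7 * x)

d, width = 1, 2000                      # input dimension and hidden width
W = rng.standard_normal((width, d))     # random Gaussian input weights (fixed)
b = rng.uniform(-1, 1, size=width)      # random biases (fixed)

def features(x):
    # ReLU random features: relu(W x + b) for each sample x.
    return np.maximum(x @ W.T + b, 0.0)

# Fit only the output layer by least squares on training samples.
x_train = rng.uniform(-1, 1, size=(500, d))
a, *_ = np.linalg.lstsq(features(x_train), f(x_train[:, 0]), rcond=None)

# Evaluate the uniform approximation error on a held-out grid.
x_test = np.linspace(-1, 1, 200).reshape(-1, 1)
err = np.max(np.abs(features(x_test) @ a - f(x_test[:, 0])))
print(f"sup-norm error on [-1, 1]: {err:.4f}")
```

Increasing the hidden width in this sketch typically shrinks the error, in line with the abstract's theme that sufficiently wide networks with random weights can approximate general continuous functions.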
