ResearchTrend.AI
  3. 2305.00322
27
4

Toward $L_\infty$-recovery of Nonlinear Functions: A Polynomial Sample Complexity Bound for Gaussian Random Fields

29 April 2023
Kefan Dong
Tengyu Ma
Abstract

Many machine learning applications require learning a function with a small worst-case error over the entire input domain, that is, the $L_\infty$-error, whereas most existing theoretical works only guarantee recovery in average errors such as the $L_2$-error. $L_\infty$-recovery from polynomial samples is even impossible for seemingly simple function classes such as constant-norm infinite-width two-layer neural nets. This paper makes some initial steps beyond the impossibility results by leveraging the randomness in the ground-truth functions. We prove a polynomial sample complexity bound for random ground-truth functions drawn from Gaussian random fields. Our key technical novelty is to prove that the degree-$k$ spherical harmonics components of a function from a Gaussian random field cannot be spiky, in that their $L_\infty/L_2$ ratios are upper-bounded by $O(d\sqrt{\ln k})$ with high probability. In contrast, the worst-case $L_\infty/L_2$ ratio for degree-$k$ spherical harmonics is on the order of $\Omega(\min\{d^{k/2}, k^{d/2}\})$.
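The $L_\infty/L_2$ ratio in the abstract measures how "spiky" a function is relative to its average magnitude. As an illustration only (not the paper's construction), the sketch below draws an approximate sample from a Gaussian random field on the unit sphere using random Fourier features and estimates its empirical $L_\infty/L_2$ ratio by Monte Carlo; the feature count `m` and sample count `n` are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10       # input dimension
m = 2000     # random features approximating the Gaussian field
n = 50_000   # evaluation points on the unit sphere

# Approximate sample from a Gaussian random field via random Fourier
# features: f(x) = sqrt(2/m) * sum_i c_i * cos(w_i . x + b_i)
W = rng.standard_normal((m, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=m)
c = rng.standard_normal(m)

# Uniform samples on the unit sphere S^{d-1}
X = rng.standard_normal((n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)

f = np.sqrt(2.0 / m) * (np.cos(X @ W.T + b) @ c)

l_inf = np.abs(f).max()           # empirical L_infty norm
l_2 = np.sqrt(np.mean(f ** 2))    # empirical L_2 norm (uniform measure)
print(f"empirical L_inf / L_2 ratio: {l_inf / l_2:.2f}")
```

For a smooth random field like this one the empirical ratio comes out small (single digits), in the spirit of the paper's claim that random-field samples are not spiky, in contrast to worst-case spherical harmonics whose ratio can grow like $d^{k/2}$.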
