ResearchTrend.AI
Nonparametric Regression on Low-Dimensional Manifolds using Deep ReLU Networks: Function Approximation and Statistical Recovery

5 August 2019
Minshuo Chen
Haoming Jiang
Wenjing Liao
T. Zhao
Abstract

Real-world data often exhibit low-dimensional geometric structures and can be viewed as samples near a low-dimensional manifold. This paper studies nonparametric regression of Hölder functions on low-dimensional manifolds using deep ReLU networks. Suppose $n$ training data are sampled from a Hölder function in $\mathcal{H}^{s,\alpha}$ supported on a $d$-dimensional Riemannian manifold isometrically embedded in $\mathbb{R}^D$, with sub-Gaussian noise. A deep ReLU network architecture is designed to estimate the underlying function from the training data. The mean squared error of the empirical estimator is proved to converge in the order of $n^{-\frac{2(s+\alpha)}{2(s+\alpha)+d}} \log^3 n$. This result shows that deep ReLU networks give rise to a fast convergence rate depending on the intrinsic dimension $d$ of the data, which is usually much smaller than the ambient dimension $D$. It therefore demonstrates the adaptivity of deep ReLU networks to low-dimensional geometric structures of data, and partially explains the power of deep ReLU networks in tackling high-dimensional data with low-dimensional geometric structures.
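To make the stated rate concrete, the following is a minimal numerical sketch (not from the paper) of how the mean-squared-error rate $n^{-\frac{2(s+\alpha)}{2(s+\alpha)+d}} \log^3 n$ behaves when the exponent uses the intrinsic dimension $d$ versus the ambient dimension $D$. The specific values of $s$, $\alpha$, $d$, $D$, and $n$ below are illustrative assumptions, and all constants in front of the rate are omitted.

```python
import math

def mse_rate(n: int, s: int, alpha: float, dim: int) -> float:
    """Order of the MSE bound, n^{-2(s+alpha)/(2(s+alpha)+dim)} * log^3 n,
    with multiplicative constants omitted."""
    smooth = 2 * (s + alpha)
    return n ** (-smooth / (smooth + dim)) * math.log(n) ** 3

# Illustrative setting: Hölder smoothness s + alpha = 2, n = 10^6 samples,
# intrinsic dimension d = 8 versus ambient dimension D = 128 (assumed values).
n = 10**6
rate_intrinsic = mse_rate(n, s=1, alpha=1.0, dim=8)
rate_ambient = mse_rate(n, s=1, alpha=1.0, dim=128)

# The bound driven by the intrinsic dimension d decays far faster
# than one that scaled with the ambient dimension D would.
print(rate_intrinsic < rate_ambient)  # True
```

The gap widens as $n$ grows, which is the sense in which the network "adapts" to the manifold: the exponent depends only on $d$, not on $D$.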
