arXiv:2001.07883

Learning functions varying along an active subspace

22 January 2020
Hao Liu
Wenjing Liao
Abstract

Many functions of interest are in a high-dimensional space but exhibit low-dimensional structures. This paper studies regression of an $s$-Hölder function $f$ in $\mathbb{R}^D$ which varies along an active subspace of dimension $d$, where $d \ll D$. A direct approximation of $f$ in $\mathbb{R}^D$ with $\varepsilon$ accuracy requires a number of samples $n$ in the order of $\varepsilon^{-(2s+D)/s}$. In this paper, we modify the Generalized Contour Regression (GCR) algorithm to estimate the active subspace and use piecewise polynomials for function approximation. GCR is among the best estimators for the active subspace, but its sample complexity is an open question. Our modified GCR improves the efficiency over the original GCR and leads to a mean squared estimation error of $O(n^{-1})$ for the active subspace, when $n$ is sufficiently large. The mean squared regression error of $f$ is proved to be in the order of $\left(n/\log n\right)^{-\frac{2s}{2s+d}}$, where the exponent depends on the dimension $d$ of the active subspace instead of the ambient dimension $D$. This result demonstrates that GCR is effective in learning low-dimensional active subspaces. The convergence rate is validated through several numerical experiments.
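The abstract describes a two-stage procedure: first estimate the active subspace, then fit piecewise polynomials on the data projected onto it. The sketch below illustrates that pipeline under stated assumptions, not the paper's method: the subspace estimator is a simple finite-difference gradient-outer-product surrogate rather than the authors' modified GCR, it assumes oracle access to $f$ (which the pure regression setting does not grant), and the fit uses a single global polynomial rather than piecewise polynomials. All names and parameters are illustrative.

```python
# Illustrative sketch of the two-stage active-subspace regression pipeline.
# NOT the paper's modified GCR: the subspace estimate below uses a
# finite-difference gradient outer product and oracle access to f.
from itertools import combinations_with_replacement

import numpy as np


def estimate_active_subspace(f, X, d, h=1e-4):
    """Top-d eigenvectors of the averaged gradient outer product of f."""
    n, D = X.shape
    M = np.zeros((D, D))
    for x in X:
        # Forward-difference gradient estimate (illustrative surrogate).
        g = np.array([(f(x + h * e) - f(x)) / h for e in np.eye(D)])
        M += np.outer(g, g)
    M /= n
    _, eigvecs = np.linalg.eigh(M)  # eigenvalues in ascending order
    return eigvecs[:, -d:]          # D x d orthonormal basis


def poly_features(Z, degree):
    """Multivariate monomial features of Z (n x d) up to `degree`."""
    n, d = Z.shape
    cols = [np.ones(n)]
    for k in range(1, degree + 1):
        for idx in combinations_with_replacement(range(d), k):
            cols.append(np.prod(Z[:, list(idx)], axis=1))
    return np.column_stack(cols)


rng = np.random.default_rng(0)
D, d, n = 20, 1, 500
a = rng.standard_normal(D)
a /= np.linalg.norm(a)
f = lambda x: np.sin(x @ a)            # varies only along span(a), so d = 1
X = rng.uniform(-1.0, 1.0, size=(n, D))
y = np.array([f(x) for x in X])

A = estimate_active_subspace(f, X, d)
print("subspace alignment |<a, A>|:", abs(a @ A[:, 0]))  # near 1 if recovered

# Global polynomial fit in the projected coordinate; the paper's piecewise
# polynomials adapt locally, but a global fit illustrates the idea.
Phi = poly_features(X @ A, degree=5)
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print("training MSE:", np.mean((Phi @ coef - y) ** 2))
```

The alignment printout is a crude stand-in for the paper's subspace estimation error: as $n$ grows, the estimated basis should align with the true direction, mirroring the $O(n^{-1})$ rate proved for the modified GCR, while the regression error is governed by the subspace dimension $d$ rather than the ambient dimension $D$.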
