
Characterizing the minimax rate of nonparametric regression under bounded convex constraints

15 January 2024
Akshay Prasadan
Matey Neykov
Abstract

We quantify the minimax rate for a nonparametric regression model over a convex function class $\mathcal{F}$ with bounded diameter. We obtain a minimax rate of ${\varepsilon^{\ast}}^2 \wedge \mathrm{diam}(\mathcal{F})^2$, where \[\varepsilon^{\ast} = \sup\{\varepsilon > 0 : n\varepsilon^2 \le \log M_{\mathcal{F}}^{\operatorname{loc}}(\varepsilon, c)\},\] where $M_{\mathcal{F}}^{\operatorname{loc}}(\cdot, c)$ is the local metric entropy of $\mathcal{F}$ and our loss function is the squared population $L_2$ distance over our input space $\mathcal{X}$. In contrast to classical works on the topic [cf. Yang and Barron, 1999], our results do not require functions in $\mathcal{F}$ to be uniformly bounded in sup-norm. In addition, we prove that our estimator is adaptive to the true point, and to the best of our knowledge this is the first such estimator in this general setting. This work builds on the Gaussian sequence framework of Neykov [2022], using a similar algorithmic scheme to achieve the minimax rate. Our algorithmic rate also applies with sub-Gaussian noise. We illustrate the utility of this theory with examples including multivariate monotone functions, linear functionals over ellipsoids, and Lipschitz classes.
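
As a purely illustrative sketch (not part of the abstract): suppose the local metric entropy of $\mathcal{F}$ scales as $\log M_{\mathcal{F}}^{\operatorname{loc}}(\varepsilon, c) \asymp \varepsilon^{-d/\alpha}$, as is typical for $\alpha$-smooth function classes on a $d$-dimensional domain; this scaling is an assumption made here for illustration, not a claim taken from the paper. Balancing the two sides of the defining inequality for $\varepsilon^{\ast}$ then gives \[ n\varepsilon^2 \asymp \varepsilon^{-d/\alpha} \quad\Longrightarrow\quad \varepsilon^{\ast} \asymp n^{-\alpha/(2\alpha + d)}, \] so the rate ${\varepsilon^{\ast}}^2 \wedge \mathrm{diam}(\mathcal{F})^2$ reduces to the familiar $n^{-2\alpha/(2\alpha + d)}$ once $n$ is large enough that ${\varepsilon^{\ast}}^2$ falls below $\mathrm{diam}(\mathcal{F})^2$; for small $n$ the bounded diameter of $\mathcal{F}$ caps the risk instead.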
