
Minimax Optimal Regression over Sobolev Spaces via Laplacian Regularization on Neighborhood Graphs

Abstract

In this paper we study the statistical properties of Laplacian smoothing, a graph-based approach to nonparametric regression. Under standard regularity conditions, we establish upper bounds on the error of the Laplacian smoothing estimator $\widehat{f}$, and a goodness-of-fit test also based on $\widehat{f}$. These upper bounds match the minimax optimal estimation and testing rates of convergence over the first-order Sobolev class $H^1(\mathcal{X})$, for $\mathcal{X} \subseteq \mathbb{R}^d$ and $1 \leq d < 4$; in the estimation problem, for $d = 4$, they are optimal modulo a $\log n$ factor. Additionally, we prove that Laplacian smoothing is manifold-adaptive: if $\mathcal{X} \subseteq \mathbb{R}^d$ is an $m$-dimensional manifold with $m < d$, then the error rate of Laplacian smoothing (in either estimation or testing) depends only on $m$, in the same way it would if $\mathcal{X}$ were a full-dimensional set in $\mathbb{R}^d$.
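To make the setting concrete, the standard Laplacian smoothing estimator solves a penalized least-squares problem on a neighborhood graph: $\widehat{f} = \arg\min_f \|y - f\|_2^2 + \lambda f^\top L f = (I + \lambda L)^{-1} y$, where $L$ is the graph Laplacian. Below is a minimal NumPy sketch on an $\varepsilon$-neighborhood graph; the function name, the specific $\varepsilon$ and $\lambda$ values, and the synthetic test function are illustrative choices, not taken from the paper.

```python
import numpy as np

def laplacian_smoothing(X, y, eps=0.05, lam=1.0):
    """Illustrative Laplacian smoothing on an eps-neighborhood graph.

    Connects sample points within distance `eps`, forms the unnormalized
    graph Laplacian L = D - W, and returns the penalized least-squares
    solution f_hat = argmin_f ||y - f||^2 + lam * f' L f = (I + lam L)^{-1} y.
    """
    n = len(X)
    # Pairwise distances; W[i, j] = 1 if points i and j are eps-neighbors.
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    W = (dists <= eps).astype(float)
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W  # unnormalized graph Laplacian
    # I + lam * L is symmetric positive definite, so the solve is well posed.
    return np.linalg.solve(np.eye(n) + lam * L, y)

# Noisy samples of a smooth (Sobolev-regular) function on [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 1))
truth = np.sin(2 * np.pi * X[:, 0])
y = truth + 0.3 * rng.normal(size=200)
f_hat = laplacian_smoothing(X, y, eps=0.05, lam=2.0)
print("raw MSE:     ", np.mean((y - truth) ** 2))
print("smoothed MSE:", np.mean((f_hat - truth) ** 2))
```

Since the estimator is a linear smoother, its error can be analyzed through the spectrum of $L$, which is the route the rate analysis in the paper takes; the sketch above only illustrates the objective being solved.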
