Conditional regression for the Nonlinear Single-Variable Model
Regressing a function $F$ on $\mathbb{R}^d$ without the statistical and computational curse of dimensionality requires special statistical models, for example ones that impose geometric assumptions on the distribution of the data (e.g., that its support is low-dimensional), strong smoothness assumptions on $F$, or a special structure for $F$. Among the latter, compositional models $F = f \circ g$, with $g$ mapping $\mathbb{R}^d$ to $\mathbb{R}^r$ with $r \ll d$, include classical single- and multi-index models, as well as neural networks. While the case where $g$ is linear is well understood, less is known when $g$ is nonlinear, and in particular for which $g$'s the curse of dimensionality in estimating $F$, or both $f$ and $g$, may be circumvented. Here we consider a model $F(X) := f(\pi_\gamma(X))$, where $\pi_\gamma$ is the closest-point projection onto the parameter of a regular curve $\gamma : [0, \mathrm{len}(\gamma)] \to \mathbb{R}^d$, and $f : [0, \mathrm{len}(\gamma)] \to \mathbb{R}$. The input data is not low-dimensional: it can be as far from $\gamma$ as the condition that $\pi_\gamma$ is well-defined allows. The distribution of the data, the curve $\gamma$, and the function $f$ are all unknown. This model is a natural nonlinear generalization of the single-index model, which corresponds to $\gamma$ being a line. We propose a nonparametric estimator, based on conditional regression, and show that, under suitable assumptions, the strongest of which being that $f$ is coarsely monotone, it achieves, up to log factors, the optimal min-max rate for nonparametric regression, up to the level of noise in the observations, and can be constructed in time $\mathcal{O}(d^2 n \log n)$. All the constants in the learning bounds, in the minimal number of samples required for our bounds to hold, and in the computational complexity are at most low-order polynomials in $d$.
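The generative model described above can be illustrated with a small synthetic sketch. The specific curve (a helix occupying three coordinates of $\mathbb{R}^d$), the link function, the noise levels, and the brute-force nearest-point projection below are all illustrative assumptions, not constructions from the paper; the sketch only shows how data of the form $Y = f(\pi_\gamma(X)) + \text{noise}$ arises, with $X$ lying near, but not on, the curve.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 20, 2000  # ambient dimension and sample size (illustrative)

# Hypothetical regular curve gamma: a unit-speed helix living in the first
# 3 coordinates of R^d (an assumption for illustration, not from the paper).
def gamma(t):
    t = np.atleast_1d(t)
    g = np.zeros((t.size, d))
    g[:, 0], g[:, 1], g[:, 2] = np.cos(t), np.sin(t), t
    return g / np.sqrt(2.0)  # rescale so |gamma'(t)| = 1

t_grid = np.linspace(0.0, 2.0 * np.pi, 500)  # discretization of the parameter
curve = gamma(t_grid)

# Inputs X are full-dimensional: points near the curve, perturbed in all d
# coordinates (they do not lie on a low-dimensional set).
t_true = rng.uniform(0.0, 2.0 * np.pi, n)
X = gamma(t_true) + 0.05 * rng.normal(size=(n, d))

# Closest-point projection pi_gamma, approximated by a nearest neighbor
# search over the discretized curve.
sq_dists = ((X[:, None, :] - curve[None, :, :]) ** 2).sum(axis=-1)
pi_X = t_grid[sq_dists.argmin(axis=1)]

# A monotone (hence coarsely monotone) link function f, and noisy responses.
f = lambda s: s + 0.3 * np.sin(3.0 * s)
Y = f(pi_X) + 0.05 * rng.normal(size=n)

# Sanity check: the projection approximately recovers the latent parameter.
print(np.abs(pi_X - t_true).mean())
```

Note that perturbing all $d$ coordinates leaves the projection well-defined here because the noise is kept below the reach of the curve; the estimator in the paper operates on samples $(X_i, Y_i)$ of exactly this form, without knowledge of $\gamma$, $f$, or the input distribution.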