Global risk bounds and adaptation in univariate convex regression

We consider the problem of nonparametric estimation of a convex regression function $\phi_0$. We study global risk bounds and adaptation properties of the least squares estimator (LSE) of $\phi_0$. Under the natural squared error loss, we show that the risk of the LSE is bounded from above by $n^{-4/5}$ up to a multiplicative factor that is logarithmic in $n$. When $\phi_0$ is convex and piecewise affine with $k$ knots, we establish adaptation of the LSE by showing that its risk is bounded from above by $k/n$ up to logarithmic multiplicative factors. On the other hand, when $\phi_0$ has curvature, we show that no estimator can have risk smaller than a constant multiple of $n^{-4/5}$ in a very strong sense, by proving a "local" minimax lower bound. We also study the case of model misspecification, where we show that the LSE exhibits the same global behavior provided the loss is measured from the closest convex projection of the true regression function. In addition to the convex LSE, we also provide risk bounds for a natural sieved LSE. In the process of proving our results, we establish some new results on the covering numbers of classes of convex functions, which are of independent interest.
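To make the object of study concrete: on an equally spaced design, the convex LSE minimizes the sum of squared errors over all vectors whose second differences are nonnegative. The sketch below is an illustration only (not the paper's computational method), using a generic solver; the helper name `convex_lse` is hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def convex_lse(y):
    """Least squares fit of y subject to convexity of the fitted vector.

    Illustrative sketch: on an equally spaced grid, convexity of the fit
    is the constraint theta[i+1] - 2*theta[i] + theta[i-1] >= 0 for all
    interior i, giving a quadratic program solved here via SLSQP.
    """
    n = len(y)
    # One inequality constraint per interior point (second difference >= 0).
    cons = [{"type": "ineq",
             "fun": lambda t, i=i: t[i + 1] - 2.0 * t[i] + t[i - 1]}
            for i in range(1, n - 1)]
    res = minimize(lambda t: np.sum((t - y) ** 2), x0=np.asarray(y, float),
                   constraints=cons, method="SLSQP")
    return res.x

# Simulated example: noisy observations of a convex function on [0, 1].
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = (x - 0.5) ** 2 + 0.01 * rng.standard_normal(x.size)
fit = convex_lse(y)

# The fitted second differences should be nonnegative up to solver tolerance.
d2 = fit[2:] - 2.0 * fit[1:-1] + fit[:-2]
```

For large samples a dedicated quadratic programming or active-set solver would be preferable; the generic solver here only keeps the example self-contained.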
View on arXiv