The performance of Least Squares estimators is studied in shape restricted regression for convex cones that include nondecreasing sequences, convex sequences and higher order cones. We derive sharp oracle inequalities for the Least Squares estimator, i.e., oracle inequalities with leading constant 1. Two types of oracle inequalities are derived. Inequalities of the first type are adaptive in the sense that the rate becomes parametric if the true sequence can be well approximated by a sequence that satisfies some low-dimensional property. Inequalities of the second type yield a rate that corresponds to the nonparametric rate of smoothness classes under a localized Gaussian width assumption. The oracle inequalities hold in deviation with exponential probability bounds and in expectation. To obtain our results, we improve the best known bounds on the statistical dimension of the cone of convex sequences, and we derive upper bounds on the statistical dimension of higher order cones. We then construct an estimator that aggregates two projections on opposite convex cones. In isotonic regression, this estimator adapts to the best direction of monotonicity; in convex regression, it mimics the better of convexity and concavity. Our estimators are fully data-driven and computationally tractable.
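To make the setting concrete: over the cone of nondecreasing sequences, the Least Squares estimator is the Euclidean projection of the data onto that cone, which the pool-adjacent-violators algorithm (PAVA) computes exactly. The sketch below is illustrative only; the function name `pava` and the implementation details are not taken from the paper.

```python
def pava(y):
    """Project y onto the convex cone of nondecreasing sequences,
    i.e., compute the isotonic Least Squares fit, via PAVA."""
    # Maintain a list of blocks [mean, weight]; adjacent blocks that
    # violate monotonicity are pooled into their weighted average.
    blocks = []
    for v in y:
        blocks.append([float(v), 1])
        # Merge while the last block's mean drops below the previous one's.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    # Expand each pooled block back to its constituent positions.
    fit = []
    for mean, weight in blocks:
        fit.extend([mean] * weight)
    return fit

# The violating pair (3, 2) is pooled into its average 2.5.
print(pava([1.0, 3.0, 2.0, 4.0]))  # → [1.0, 2.5, 2.5, 4.0]
```

An aggregated estimator of the kind described in the abstract would combine this fit with the projection on the opposite cone of nonincreasing sequences, choosing between them in a data-driven way; the paper's specific aggregation scheme is not reproduced here.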