On Risk Bounds in Isotonic and Other Shape Restricted Regression Problems

We consider the problem of estimating an unknown sequence θ* ∈ ℝⁿ from noisy observations under the constraint that θ* belongs to certain convex polyhedral cones in ℝⁿ. Under this setting, we prove bounds for the risk of the least squares estimator (LSE). The obtained risk bound behaves differently depending on the true sequence θ*, which highlights the adaptive behavior of the LSE. As special cases of our general result, we derive risk bounds for the LSE in univariate isotonic and convex regression. We study the risk bound in isotonic regression in greater detail -- we show that the isotonic LSE converges at a whole range of rates, from log n / n (when θ* is constant) to n^{-2/3} (when θ* is uniformly increasing in a certain sense). We argue that this bound presents a benchmark for the risk of any estimator in isotonic regression by proving non-asymptotic local minimax lower bounds. We also prove an analogue of our bound under model misspecification, where the true θ* is not necessarily non-decreasing.
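The isotonic LSE discussed above is the Euclidean projection of the data onto the cone of non-decreasing sequences, which can be computed exactly by the classical pool adjacent violators algorithm (PAVA). The following is a minimal NumPy sketch of PAVA for illustration; the function name `pava` and its unweighted, uniform-design setup are our own choices, not notation from the paper.

```python
import numpy as np

def pava(y):
    """Pool Adjacent Violators Algorithm.

    Returns the least squares fit of y under the constraint that the
    fitted sequence is non-decreasing (the isotonic LSE).
    """
    y = np.asarray(y, dtype=float)
    # Maintain a stack of blocks, each represented by its mean and size.
    means, sizes = [], []
    for v in y:
        m, s = v, 1
        # Merge the new point with preceding blocks while the
        # non-decreasing constraint is violated.
        while means and means[-1] > m:
            pm, ps = means.pop(), sizes.pop()
            m = (pm * ps + m * s) / (ps + s)
            s = ps + s
        means.append(m)
        sizes.append(s)
    # Expand each block back to its constituent coordinates.
    return np.concatenate([np.full(s, m) for m, s in zip(means, sizes)])
```

For example, `pava([3, 1, 2])` pools all three observations into a single block with common value 2, whereas an already non-decreasing input is returned unchanged; the number and length of pooled blocks is what drives the adaptive range of rates described above.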