Improved risk bounds in isotonic regression

We consider the problem of estimating an unknown non-decreasing sequence $\theta$ from finitely many noisy observations. We give an improved global risk upper bound for the isotonic least squares estimator (LSE) in this problem. The obtained risk bound behaves differently depending on the form of the true sequence $\theta$: one gets a whole range of rates, from $\log n/n$ (when $\theta$ is constant) to $n^{-2/3}$ (when $\theta$ is \textit{uniformly increasing} in a certain sense). In particular, when $\theta$ has $k$ constant pieces, the risk bound becomes $(k/n)\log(en/k)$. As a consequence, we illustrate the adaptation properties of the LSE. We also prove an analogue of the risk bound under model misspecification, i.e., when $\theta$ is not non-decreasing. Finally, we prove local minimax lower bounds for this problem which show that the LSE is nearly optimal in a local nonasymptotic minimax sense.
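For concreteness, here is a minimal sketch (not from the paper) of the isotonic LSE computed via the standard pool adjacent violators algorithm, together with an empirical comparison against the $(k/n)\log(en/k)$ rate on a piecewise-constant sequence. The function name `isotonic_lse` and the choices of $n$, $k$, and the noise level are illustrative assumptions.

```python
# A minimal sketch of the isotonic least squares estimator (LSE) via the
# pool adjacent violators algorithm (PAVA). Signal, noise level, and
# sample size are illustrative choices, not taken from the paper.
import numpy as np

def isotonic_lse(y):
    """Least squares projection of y onto the cone of non-decreasing sequences."""
    means, weights = [], []          # each block stores (mean, weight)
    for v in y:
        means.append(float(v))
        weights.append(1.0)
        # Pool adjacent blocks while monotonicity is violated.
        while len(means) > 1 and means[-2] > means[-1]:
            w = weights[-2] + weights[-1]
            m = (weights[-2] * means[-2] + weights[-1] * means[-1]) / w
            means, weights = means[:-2] + [m], weights[:-2] + [w]
    return np.repeat(means, [int(w) for w in weights])

rng = np.random.default_rng(0)
n, k = 1000, 4                                    # k constant pieces
theta = np.repeat(np.arange(k), n // k).astype(float)
y = theta + rng.normal(scale=1.0, size=n)
theta_hat = isotonic_lse(y)
risk = np.mean((theta_hat - theta) ** 2)
print(f"empirical risk {risk:.4f}  vs  (k/n)log(en/k) = {k/n*np.log(np.e*n/k):.4f}")
```

On such piecewise-constant inputs the empirical risk is expected to track the parametric-type $(k/n)\log(en/k)$ bound, up to constants, rather than the worst-case $n^{-2/3}$ rate.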