Improved risk bounds in isotonic regression

Abstract

We consider the problem of estimating an unknown non-decreasing sequence $\theta$ from finitely many noisy observations. We give an improved global risk upper bound for the isotonic least squares estimator (LSE) in this problem. The obtained risk bound behaves differently depending on the form of the true sequence $\theta$ -- one gets a whole range of rates from $\log n/n$ (when $\theta$ is constant) to $n^{-2/3}$ (when $\theta$ is \textit{uniformly increasing} in a certain sense). In particular, when $\theta$ has $k$ constant pieces, the risk bound becomes $(k/n) \log(en/k)$. As a consequence, we illustrate the adaptation properties of the LSE. We also prove an analogue of the risk bound under model misspecification, i.e., when $\theta$ is not non-decreasing. Finally, we prove local minimax lower bounds showing that the LSE is nearly optimal in a local nonasymptotic minimax sense.
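The isotonic LSE discussed in the abstract is the least squares projection of the observations onto the cone of non-decreasing sequences; it can be computed with the classical pool adjacent violators algorithm (PAVA). A minimal sketch (the function name `pava` is our own; this is an illustration of the estimator, not code from the paper):

```python
def pava(y):
    """Isotonic least squares fit of a sequence y via the
    pool adjacent violators algorithm (PAVA).

    Maintains a stack of blocks (sum, count); whenever the mean of the
    previous block exceeds that of the newest block, the monotonicity
    constraint is violated and the two blocks are pooled.
    """
    blocks = []  # each block is [total, count]
    for v in y:
        blocks.append([float(v), 1])
        # Pool while adjacent block means violate non-decreasing order.
        while (len(blocks) > 1
               and blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]):
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    # Expand pooled blocks back into a fitted sequence.
    fit = []
    for total, count in blocks:
        fit.extend([total / count] * count)
    return fit


# Example: the fit averages out violations and is always non-decreasing.
print(pava([3, 1, 2]))   # -> [2.0, 2.0, 2.0]  (one constant piece)
print(pava([1, 2, 3]))   # -> [1.0, 2.0, 3.0]  (already monotone)
```

Note that on a noisy constant sequence the fit tends to collapse into few constant pieces (small $k$), which is where the $(k/n)\log(en/k)$ bound gives nearly parametric rates.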
