
Minimax Risk Bounds for Piecewise Constant Models

Abstract

Consider a sequence of data points $X_1,\ldots,X_n$ whose underlying mean $\theta^*\in\mathbb{R}^n$ is piecewise constant with at most $k^*$ pieces. This paper establishes sharp nonasymptotic risk bounds for the least squares estimator (LSE) of $\theta^*$. The main results are twofold. First, when no additional shape constraint is assumed, we reveal a new phase transition for the risk of the LSE: as $k^*$ increases from 2, the rate changes from $\log\log n$ to $k^*\log(en/k^*)$. Second, when $\theta^*$ is further assumed to be nondecreasing, we show that the rate improves to $k^*\log\log(16n/k^*)$ over the full range $2\leq k^*\leq n$. These bounds are sharp in the sense that they match the minimax lower bounds of the studied problems (without sacrificing any logarithmic factor). They complement their counterparts in the change-point detection literature and fill notable gaps in recent discoveries relating isotonic regression to piecewise constant models. The techniques developed in the proofs, built on Lévy's partial sum and Doob's martingale theory, are of independent interest and may have applications to the study of other shape-constrained regression problems.
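
For readers who prefer the rates in display form, the following is a hedged sketch of how the stated bounds might be written. The Gaussian noise level $\sigma^2$, the normalization by $n$, and the hidden absolute constants behind $\lesssim$ are assumptions for illustration and are not specified in the abstract itself:
\[
  \frac{1}{n}\,\mathbb{E}\bigl\|\hat\theta_{\mathrm{LS}}-\theta^*\bigr\|_2^2
  \;\lesssim\;
  \frac{\sigma^2 k^*}{n}\,\log\!\Bigl(\frac{en}{k^*}\Bigr)
  \qquad\text{(no shape constraint),}
\]
\[
  \frac{1}{n}\,\mathbb{E}\bigl\|\hat\theta_{\mathrm{LS}}-\theta^*\bigr\|_2^2
  \;\lesssim\;
  \frac{\sigma^2 k^*}{n}\,\log\log\!\Bigl(\frac{16n}{k^*}\Bigr)
  \qquad(\theta^*\ \text{nondecreasing},\ 2\le k^*\le n),
\]
with the unconstrained rate improving to $\sigma^2\log\log n/n$ in the boundary case $k^*=2$, which is the phase transition described above.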
