On matrix estimation under monotonicity constraints

Abstract

We consider the problem of estimating an unknown $n_1 \times n_2$ matrix $\mathbf{\theta}^*$ from noisy observations under the constraint that $\mathbf{\theta}^*$ is nondecreasing in both rows and columns. We consider the least squares estimator (LSE) in this setting and study its risk properties. We show that the worst-case risk of the LSE is $n^{-1/2}$, up to logarithmic factors, where $n = n_1 n_2$, and that the LSE is minimax rate optimal up to logarithmic factors. We further prove that for some special $\mathbf{\theta}^*$, the risk of the LSE can be much smaller than $n^{-1/2}$; in fact, it can even be parametric, i.e., $n^{-1}$ up to logarithmic factors. We derive, as a consequence, an interesting adaptation property of the LSE which we term variable adaptation.
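The LSE here is the Euclidean projection of the observed matrix onto the cone of matrices that are nondecreasing in both rows and columns. One way to compute this projection (an illustrative sketch, not the paper's own implementation) is Dykstra's alternating-projections algorithm, projecting alternately onto the row-monotone and column-monotone cones; each 1-D projection is the classical pool-adjacent-violators algorithm (PAVA):

```python
import numpy as np

def pava(y):
    """L2 projection of a sequence onto nondecreasing sequences
    (pool adjacent violators), via a stack of (value, weight) blocks."""
    vals, wts = [], []
    for v in np.asarray(y, dtype=float):
        vals.append(v)
        wts.append(1.0)
        # Merge blocks while the monotonicity constraint is violated.
        while len(vals) > 1 and vals[-2] > vals[-1]:
            v2, w2 = vals.pop(), wts.pop()
            v1, w1 = vals.pop(), wts.pop()
            vals.append((w1 * v1 + w2 * v2) / (w1 + w2))
            wts.append(w1 + w2)
    return np.concatenate([np.full(int(w), v) for v, w in zip(vals, wts)])

def matrix_lse(Y, n_iter=100):
    """Approximate projection of Y onto the bimonotone cone via
    Dykstra's algorithm: alternate row-wise and column-wise PAVA,
    carrying the usual Dykstra correction terms P and Q."""
    X = np.asarray(Y, dtype=float).copy()
    P = np.zeros_like(X)  # correction for the row-monotone projection
    Q = np.zeros_like(X)  # correction for the column-monotone projection
    for _ in range(n_iter):
        Z = X + P
        Xr = np.apply_along_axis(pava, 1, Z)  # rows nondecreasing
        P = Z - Xr
        Z = Xr + Q
        X = np.apply_along_axis(pava, 0, Z)   # columns nondecreasing
        Q = Z - X
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    theta = np.add.outer(np.arange(5.0), np.arange(6.0))  # bimonotone truth
    Y = theta + 0.5 * rng.standard_normal(theta.shape)    # noisy observations
    print(matrix_lse(Y, n_iter=200))
```

Because the two monotone cones are closed convex sets, Dykstra's iterates converge to the exact projection onto their intersection; with finitely many iterations the output satisfies column monotonicity exactly (it is the last projection applied) and row monotonicity up to a small tolerance.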
