On the minimax rate of the Gaussian sequence model under bounded convex constraints

Abstract

We determine the exact minimax rate of a Gaussian sequence model under bounded convex constraints, purely in terms of the local geometry of the given constraint set $K$. Our main result shows that the minimax risk (up to constant factors) under the squared $\ell_2$ loss is given by $\epsilon^{*2} \wedge \operatorname{diam}(K)^2$ with \begin{align*} \epsilon^* = \sup \bigg\{\epsilon : \frac{\epsilon^2}{\sigma^2} \leq \log M^{\operatorname{loc}}(\epsilon)\bigg\}, \end{align*} where $\log M^{\operatorname{loc}}(\epsilon)$ denotes the local entropy of the set $K$, and $\sigma^2$ is the variance of the noise. We utilize our abstract result to re-derive known minimax rates for some special sets $K$ such as hyperrectangles, ellipses, and more generally quadratically convex orthosymmetric sets. Finally, we extend our results to the unbounded case with known $\sigma^2$ to show that the minimax rate in that case is $\epsilon^{*2}$.
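To make the fixed-point characterization concrete, here is a minimal numerical sketch (not from the paper) of how $\epsilon^*$ could be computed by bisection once a local-entropy function is available. The local entropy used below, a polynomial decay $(R/\epsilon)^{1/\alpha}$, is a hypothetical stand-in chosen only for illustration, as are the parameter values; the function name `critical_radius` is likewise an invented helper.

```python
def critical_radius(local_entropy, sigma, eps_hi, eps_lo=1e-12, tol=1e-10):
    """Approximate eps* = sup{eps : eps^2 / sigma^2 <= local_entropy(eps)}.

    Assumes eps^2/sigma^2 - local_entropy(eps) is increasing in eps
    (typical, since local entropy is non-increasing in eps), so the
    supremum is a single crossing point found by bisection.
    """
    def gap(eps):
        return eps**2 / sigma**2 - local_entropy(eps)

    if gap(eps_hi) <= 0:        # inequality holds on the whole bracket
        return eps_hi
    lo, hi = eps_lo, eps_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if gap(mid) <= 0:
            lo = mid            # inequality still holds; raise the lower end
        else:
            hi = mid            # inequality violated; shrink from above
    return 0.5 * (lo + hi)

# Hypothetical local entropy log M^loc(eps) ~ (R/eps)^(1/alpha), for illustration only.
sigma, R, alpha = 0.1, 1.0, 1.0
eps_star = critical_radius(lambda e: (R / e) ** (1.0 / alpha), sigma, eps_hi=R)
print(eps_star**2, min(eps_star**2, (2 * R) ** 2))  # eps*^2 vs. eps*^2 ∧ diam(K)^2
```

In this toy setting the fixed-point equation reduces to $\epsilon^3 = \sigma^2 R$, so the bisection output can be checked against the closed-form root; for actual constraint sets the local entropy would have to be computed or bounded from the geometry of $K$, as the paper does for hyperrectangles, ellipses, and quadratically convex orthosymmetric sets.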
