Empirical Risk Minimization as Parameter Choice Rule for General Linear Regularization Methods

Abstract

We consider the statistical inverse problem of recovering $f$ from noisy measurements $Y = Tf + \sigma \xi$, where $\xi$ is Gaussian white noise and $T$ is a compact operator between Hilbert spaces. Considering general reconstruction methods of the form $\hat f_\alpha = q_\alpha\left(T^*T\right)T^*Y$ with an ordered filter $q_\alpha$, we investigate the choice of the regularization parameter $\alpha$ by minimizing an unbiased estimate of the predictive risk $\mathbb E\left[\Vert Tf - T\hat f_\alpha\Vert^2\right]$. The corresponding parameter $\alpha_{\mathrm{pred}}$ and its usage are well known in the literature, but oracle inequalities and optimality results in this general setting are unknown. We prove a (generalized) oracle inequality, which relates the direct risk $\mathbb E\left[\Vert f - \hat f_{\alpha_{\mathrm{pred}}}\Vert^2\right]$ to the oracle prediction risk $\inf_{\alpha>0}\mathbb E\left[\Vert Tf - T\hat f_{\alpha}\Vert^2\right]$. From this oracle inequality we are then able to conclude that the investigated parameter choice rule is of optimal order. Finally, we also present numerical simulations, which support the order optimality of the method and the quality of the parameter choice in finite sample situations.
