Adaptive minimax optimality in statistical inverse problems via SOLIT -- Sharp Optimal Lepskii-Inspired Tuning

We consider statistical linear inverse problems in separable Hilbert spaces and filter-based reconstruction methods of the form $\hat f_\alpha = h_\alpha(T^*T)T^*Y$, where $Y$ is the available data, $T$ the forward operator, $(h_\alpha)_{\alpha \in \mathcal{A}}$ an ordered filter, and $\alpha$ a regularization parameter. Whenever such a method is used in practice, $\alpha$ has to be chosen appropriately. Typically, the aim is to find, or at least approximate, the best possible $\alpha$ in the sense that the mean squared error (MSE) w.r.t. the true solution is minimized. In this paper, we introduce the Sharp Optimal Lepskiĭ-Inspired Tuning (SOLIT) method, which yields an a posteriori parameter choice rule ensuring adaptive minimax rates of convergence. It depends only on the data $Y$ and the noise level, as well as the operator $T$ and the filter $(h_\alpha)_{\alpha \in \mathcal{A}}$, and it does not require any problem-dependent tuning of further parameters. We prove an oracle inequality for the corresponding MSE in a general setting and derive rates of convergence in different scenarios. By a careful analysis we show that no other a posteriori parameter choice rule can achieve a better convergence rate of the MSE. In particular, our results reveal that the common understanding that Lepskiĭ-type methods in inverse problems necessarily lose a log factor is incorrect. In addition, the empirical performance of SOLIT is examined in simulations.
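To make the setting concrete, here is a minimal, self-contained sketch of filter-based reconstruction $\hat f_\alpha = h_\alpha(T^*T)T^*Y$ combined with a classical Lepskiĭ-type balancing rule for choosing $\alpha$. This is not the authors' SOLIT rule: the Tikhonov filter, the toy forward operator, the candidate grid, and the textbook constant 4 in the balancing condition are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretized toy forward operator T (a mildly smoothing kernel matrix).
n = 100
idx = np.arange(n)
T = 1.0 / (1.0 + np.abs(idx[:, None] - idx[None, :])) / n

f_true = np.sin(np.linspace(0.0, 2.0 * np.pi, n))   # true solution
sigma = 1e-4                                        # known noise level
Y = T @ f_true + sigma * rng.standard_normal(n)     # observed data Y

U, s, Vt = np.linalg.svd(T)                         # spectral decomposition of T

def reconstruct(alpha):
    """Tikhonov filter h_alpha(lam) = 1/(lam + alpha), i.e.
    f_alpha = h_alpha(T^*T) T^* Y = sum_k s_k/(s_k^2 + alpha) <Y, u_k> v_k."""
    return Vt.T @ (s / (s**2 + alpha) * (U.T @ Y))

def noise_bound(alpha):
    """Proxy for the stochastic error: sigma times the Hilbert-Schmidt norm
    of h_alpha(T^*T) T^* (root of the expected squared noise contribution)."""
    return sigma * np.sqrt(np.sum((s / (s**2 + alpha)) ** 2))

# Candidate parameters, ordered so noise_bound increases with the index.
alphas = np.geomspace(1e-2, 1e-10, 40)
recons = [reconstruct(a) for a in alphas]
bounds = [noise_bound(a) for a in alphas]

# Balancing rule: take the most regularized index whose reconstruction stays
# within 4 * noise_bound of every less regularized candidate (the constant 4
# is the textbook choice for the classical Lepskii principle, not SOLIT's).
m = len(alphas)
i_star = next(
    i for i in range(m)
    if all(np.linalg.norm(recons[i] - recons[j]) <= 4.0 * bounds[j]
           for j in range(i + 1, m))
)

f_hat = recons[i_star]
print(f"chosen alpha = {alphas[i_star]:.2e}, "
      f"MSE = {np.mean((f_hat - f_true) ** 2):.3e}")
```

The rule is fully data-driven in the same spirit as the abstract describes: it uses only $Y$, the noise level, the operator, and the filter. The paper's contribution is a sharpened variant of such a comparison rule whose constants are calibrated so that, unlike the classical principle sketched here, no logarithmic factor is lost in the convergence rate.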