
Learning from MOM's principles: Le Cam's approach

Abstract

We obtain estimation error rates for estimators built by aggregating regularized median-of-means tests, following a construction of Le Cam. The results hold with exponentially large probability, as in the Gaussian framework with independent noise, under only weak moment assumptions on the data and without assuming independence between noise and design. Any norm may be used for regularization; when it has some sparsity-inducing power, we recover sparse rates of convergence. The procedure is robust: a large part of the data may be corrupted by outliers that have nothing to do with the oracle we want to reconstruct. Our general risk bound is of order \begin{equation*} \max\left(\text{minimax rate in the i.i.d. setup}, \frac{\text{number of outliers}}{\text{number of observations}}\right) \enspace. \end{equation*} In particular, the number of outliers may be as large as (number of data) $\times$ (minimax rate) without affecting this rate. The remaining data do not have to be identically distributed but should only have equivalent $L^1$ and $L^2$ moments. For example, the minimax rate $s\log(ed/s)/N$ for recovery of an $s$-sparse vector in $\mathbb{R}^d$ is achieved with exponentially large probability by a median-of-means version of the LASSO when the noise has $q_0$ moments for some $q_0 > 2$, the entries of the design matrix have $C_0\log(ed)$ moments, and the dataset may be corrupted by up to $C_1 s\log(ed/s)$ outliers.
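The robustness claim rests on the median-of-means principle. The following minimal Python sketch is my own illustration of that principle for mean estimation, not the paper's Le Cam-style aggregation of regularized tests; the function name, block count, and toy data are assumptions made for the example.

```python
import numpy as np

def median_of_means(x, n_blocks, rng=None):
    """Median-of-means estimate of E[X]: shuffle the sample, split it into
    n_blocks equal blocks, average within each block, and return the median
    of the block means. The median step makes the estimate insensitive to
    up to roughly n_blocks / 2 corrupted observations."""
    rng = np.random.default_rng() if rng is None else rng
    x = rng.permutation(np.asarray(x, dtype=float))
    block_means = [b.mean() for b in np.array_split(x, n_blocks)]
    return np.median(block_means)

# Toy illustration: heavy-tailed data plus a few gross outliers.
rng = np.random.default_rng(0)
sample = rng.standard_t(df=3, size=1000)  # true mean 0, only ~3 moments
sample[:20] = 1e6                         # 20 adversarial outliers
print(np.mean(sample))                    # empirical mean is ruined
print(median_of_means(sample, n_blocks=50, rng=rng))  # stays near 0
```

With 50 blocks and 20 outliers, at most 20 block means are contaminated, so the median is still taken over a majority of clean blocks; this mirrors the abstract's point that the number of outliers can grow with the sample size without degrading the rate.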
