Outlier-robust sparse/low-rank least-squares regression and robust matrix completion

We study high-dimensional least-squares regression within a subgaussian statistical learning framework with heterogeneous noise. It includes $s$-sparse and $r$-low-rank least-squares regression when a fraction $\epsilon$ of the labels are adversarially contaminated. We also present a novel theory of trace-regression with matrix decomposition based on a new application of the product process. For these problems, we show novel near-optimal "subgaussian" estimation rates of the form $r(n,d_e) + \sqrt{\log(1/\delta)/n} + \epsilon\log(1/\epsilon)$, valid with probability at least $1-\delta$. Here, $r(n,d_e)$ is the optimal uncontaminated rate as a function of the effective dimension $d_e$ but independent of the failure probability $\delta$. These rates are valid uniformly on $\delta$, i.e., the estimators' tuning does not depend on $\delta$. Lastly, we consider noisy robust matrix completion with non-uniform sampling. If only the low-rank matrix is of interest, we present a novel near-optimal rate that is independent of the corruption level $\epsilon$. Our estimators are tractable and based on a new "sorted" Huber-type loss. No information on $(\sigma,\epsilon)$ is needed to tune these estimators. Our analysis makes use of novel $\delta$-optimal concentration inequalities for the multiplier and product processes, which could be useful elsewhere. For instance, they imply novel sharp oracle inequalities for Lasso and Slope with optimal dependence on $\delta$. Numerical simulations confirm our theoretical predictions. In particular, "sorted" Huber regression can outperform classical Huber regression.
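The abstract does not spell out the "sorted" Huber-type loss. Below is a minimal sketch, assuming the classical equivalence between Huber regression and least squares augmented with a per-observation corruption vector $\theta$, with the usual $\ell_1$ penalty on $\theta$ replaced by a sorted-$\ell_1$ (Slope-type) norm. The function names, the decreasing weight choice `omega`, and the simulated demo are illustrative assumptions, not the paper's exact estimator or tuning.

```python
import numpy as np

def prox_sorted_l1(v, lam):
    """Prox of the sorted-l1 (Slope) norm J(x) = sum_i lam_i * |x|_(i),
    for nonincreasing lam >= 0, via a stack-based
    pool-adjacent-violators scheme."""
    sgn = np.sign(v)
    order = np.argsort(-np.abs(v))       # sort |v| in decreasing order
    u = np.abs(v)[order] - lam           # shift, then project onto the
    blocks = []                          # nonincreasing nonnegative cone
    for ui in u:
        blocks.append([ui, 1])           # [block sum, block size]
        # merge while a later block's average exceeds an earlier one's
        while len(blocks) > 1 and \
                blocks[-2][0] * blocks[-1][1] <= blocks[-1][0] * blocks[-2][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    x = np.concatenate([np.full(c, max(s / c, 0.0)) for s, c in blocks])
    out = np.empty_like(x)
    out[order] = x                       # undo the sort, restore signs
    return sgn * out

def sorted_huber_regression(X, y, omega, n_iter=100):
    """Alternating minimization of the jointly convex objective
    (1/2n) * ||y - X b - theta||^2 + sum_i omega_i * |theta|_(i)."""
    n, p = X.shape
    b, theta = np.zeros(p), np.zeros(n)
    for _ in range(n_iter):
        theta = prox_sorted_l1(y - X @ b, n * omega)  # Slope prox step
        b = np.linalg.lstsq(X, y - theta, rcond=None)[0]
    return b, theta

# Demo: low-dimensional regression with a fraction of corrupted labels.
rng = np.random.default_rng(0)
n, p, n_out = 200, 5, 20
X = rng.standard_normal((n, p))
b_true = rng.standard_normal(p)
y = X @ b_true + 0.5 * rng.standard_normal(n)
y[:n_out] += 20.0                        # adversarial-style outliers
# Illustrative Slope-type decreasing weights (not the paper's constants).
omega = 0.5 * np.sqrt(np.log(2 * n / np.arange(1, n + 1)) / n)
b_hat, theta_hat = sorted_huber_regression(X, y, omega)
print("estimation error:", np.linalg.norm(b_hat - b_true))
print("flagged outliers:", np.count_nonzero(theta_hat))
```

The Slope prox uses the stack-based pool-adjacent-violators scheme known from the SLOPE literature (Bogdan et al., 2015). Replacing `prox_sorted_l1` with plain per-coordinate soft-thresholding recovers classical Huber regression, the baseline against which the abstract's last sentence compares.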