Outlier-robust sparse/low-rank least-squares regression and robust matrix completion

Abstract

We study high-dimensional least-squares regression within a subgaussian statistical learning framework with heterogeneous noise. It includes $s$-sparse and $r$-low-rank least-squares regression when a fraction $\epsilon$ of the labels are adversarially contaminated. We also present a novel theory of trace regression with matrix decomposition based on a new application of the product process. For these problems, we show novel near-optimal "subgaussian" estimation rates of the form $r(n,d_{e})+\sqrt{\log(1/\delta)/n}+\epsilon\log(1/\epsilon)$, valid with probability at least $1-\delta$. Here, $r(n,d_{e})$ is the optimal uncontaminated rate as a function of the effective dimension $d_{e}$ but independent of the failure probability $\delta$. These rates are valid uniformly in $\delta$, i.e., the estimators' tuning does not depend on $\delta$. Lastly, we consider noisy robust matrix completion with non-uniform sampling. If only the low-rank matrix is of interest, we present a novel near-optimal rate that is independent of the corruption level $a$. Our estimators are tractable and based on a new "sorted" Huber-type loss. No information on $(s,r,\epsilon,a)$ is needed to tune these estimators. Our analysis makes use of novel $\delta$-optimal concentration inequalities for the multiplier and product processes, which could be useful elsewhere. For instance, they imply novel sharp oracle inequalities for Lasso and Slope with optimal dependence on $\delta$. Numerical simulations confirm our theoretical predictions. In particular, "sorted" Huber regression can outperform classical Huber regression.
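To make the "sorted" Huber idea concrete, here is a minimal illustrative sketch (not the paper's exact estimator): a Slope-style variant of the Huber loss in which the absolute residuals are sorted in decreasing order and matched with a non-increasing sequence of per-rank thresholds, so the largest (potentially contaminated) residuals are penalized only linearly. The function names, the linear threshold schedule, and the simulated data are all assumptions made for illustration.

```python
import numpy as np

def huber(u, tau):
    """Elementwise Huber loss: quadratic for |u| <= tau, linear beyond."""
    a = np.abs(u)
    return np.where(a <= tau, 0.5 * u**2, tau * a - 0.5 * tau**2)

def sorted_huber_loss(residuals, taus):
    """Slope-style sorted Huber loss (illustrative).

    residuals : array of length n
    taus      : non-increasing thresholds, one per rank; the k-th largest
                |residual| is Huberized at level taus[k].
    """
    r_sorted = np.sort(np.abs(residuals))[::-1]  # largest magnitude first
    return float(np.sum(huber(r_sorted, taus)))

# Toy usage: one gross outlier among subgaussian residuals.
rng = np.random.default_rng(0)
res = rng.normal(size=10)
res[0] = 50.0                          # adversarial contamination
taus = np.linspace(2.0, 0.5, 10)       # decreasing per-rank thresholds
loss = sorted_huber_loss(res, taus)
```

Because the outlier is matched with the largest threshold and still penalized only linearly, the loss grows like $\tau_{1}|r|$ rather than $r^{2}$ in the contamination, which is the mechanism behind the robustness claims above.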
