Outlier-robust sparse/low-rank least-squares regression and robust
matrix completion
We consider high-dimensional least-squares regression when a fraction $\epsilon$ of the labels is contaminated by an arbitrary adversary. We analyze this problem in the statistical learning framework with a subgaussian distribution and a linear hypothesis class on the space of matrices; in particular, the noise is allowed to be heterogeneous. This framework includes sparse linear regression and low-rank trace-regression. For a $d$-dimensional $s$-sparse parameter, we show that a convex regularized M-estimator using a sorted Huber-type loss achieves the near-optimal subgaussian rate with probability at least $1-\delta$. For a $d_1\times d_2$-dimensional parameter with rank $r$, a nuclear-norm regularized M-estimator using the same sorted Huber-type loss achieves a subgaussian rate that is again optimal up to a log factor. In a second part, we study the trace-regression problem when the parameter is the sum of a matrix of rank $r$ and an $s$-sparse matrix, assuming a "low-spikeness" condition. Unlike the multivariate regression studied in previous work, the design in trace-regression lacks positive-definiteness in high dimensions. Still, we show that a regularized least-squares estimator achieves a subgaussian rate. Lastly, we consider noisy matrix completion with non-uniform sampling when a fraction $\epsilon$ of the sampled entries of the low-rank matrix is corrupted by outliers. If only the low-rank matrix is of interest, we show that a nuclear-norm regularized Huber-type estimator achieves, up to log factors, the optimal rate adaptively to the corruption level. The rates mentioned above require no knowledge of $\epsilon$.
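To give a feel for the kind of estimator the abstract describes, here is a minimal sketch of outlier-robust sparse regression: an ordinary (unsorted) Huber loss with an $\ell_1$ penalty, minimized by proximal gradient descent (ISTA). This is an illustrative simplification, not the paper's sorted Huber-type M-estimator, and all function names and parameter values below are assumptions chosen for the demo.

```python
import numpy as np

def huber_psi(r, tau):
    """Derivative of the Huber loss: identity for small residuals, clipped at tau."""
    return np.clip(r, -tau, tau)

def robust_sparse_regression(X, y, lam=0.05, tau=1.0, n_iter=500):
    """l1-penalized Huber regression via proximal gradient (ISTA) -- a toy sketch."""
    n, d = X.shape
    step = n / np.linalg.norm(X, 2) ** 2  # 1/L for the smooth Huber part
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = -X.T @ huber_psi(y - X @ w, tau) / n  # gradient of the Huber risk
        w = w - step * grad
        # soft-thresholding: proximal operator of the l1 penalty
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)
    return w

# Synthetic check: an s-sparse signal with 10% of labels pushed far off by an "adversary".
rng = np.random.default_rng(0)
n, d, s = 200, 50, 5
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:s] = 1.0
y = X @ w_true + 0.1 * rng.standard_normal(n)
y[: n // 10] += 10.0  # adversarial label corruption
w_hat = robust_sparse_regression(X, y)
```

Clipping the residual through `huber_psi` bounds the influence any single corrupted label can exert on the gradient, which is why the estimate stays close to the true sparse parameter despite the contamination.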