
On the Complexity of Robust PCA and $\ell_1$-norm Low-Rank Matrix Approximation

Abstract

The low-rank matrix approximation problem with respect to the component-wise $\ell_1$-norm ($\ell_1$-LRA), which is closely related to robust principal component analysis (PCA), has become a very popular tool in data mining and machine learning. Robust PCA aims at recovering a low-rank matrix that was perturbed with sparse noise, with applications for example in foreground-background video separation. Although $\ell_1$-LRA is strongly believed to be NP-hard, there is, to the best of our knowledge, no formal proof of this fact. In this paper, we prove that $\ell_1$-LRA is NP-hard, already in the rank-one case, using a reduction from MAX CUT. Our derivations draw interesting connections between $\ell_1$-LRA and several other well-known problems, namely, robust PCA, $\ell_0$-LRA, binary matrix factorization, a particular densest bipartite subgraph problem, the computation of the cut norm of $\{-1,+1\}$ matrices, and the discrete basis problem, which we all prove to be NP-hard.
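For concreteness, the rank-one instance of the problem discussed in the abstract can be written as follows (a standard formulation; the symbols $M$, $u$, $v$ are generic names for the data matrix and factor vectors, not notation taken from the paper itself):

$$
\min_{u \in \mathbb{R}^m,\, v \in \mathbb{R}^n} \; \|M - uv^T\|_1 \;=\; \min_{u,\, v} \; \sum_{i=1}^{m} \sum_{j=1}^{n} \left| M_{ij} - u_i v_j \right|,
$$

where $M \in \mathbb{R}^{m \times n}$ is the given data matrix. Unlike the Frobenius-norm analogue, which is solved by the truncated singular value decomposition, this component-wise $\ell_1$ objective is non-smooth and non-convex in $(u, v)$ jointly, which is consistent with the abstract's claim that even this rank-one case is NP-hard.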
