
Low-Rank Matrix Completion with Adversarial Missing Entries

Abstract

We show that, so long as the number of missing entries in any row or column is bounded by a function of the dimension, rank, and incoherence of the matrix, nuclear norm minimization recovers the target matrix exactly. The range for which this guarantee holds is surprisingly large: in an n×n matrix of constant rank, there may be as many as Ω(n) entries missing in every row and column. Conversely, if only constantly many entries are missing in any row or column, then we may recover matrices of rank Ω(n). We also use adversarial matrix completion to give an algorithm for completing an order-m symmetric low-rank tensor from its multilinear entries in time roughly proportional to the number of tensor entries. We apply our tensor completion algorithm to the problem of learning mixtures of product distributions over the hypercube, obtaining new algorithmic results. If the centers of the product distribution are linearly independent, then we recover distributions with as many as Ω(n) centers in polynomial time and sample complexity. In the general case, we recover distributions with as many as Ω̃(n) centers in quasi-polynomial time, answering an open problem of Feldman et al. (SIAM J. Comp.) for the special case of distributions with incoherent bias vectors.
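To make the setting concrete, the following is a minimal sketch of low-rank completion under an adversarial missing pattern in which a few entries are hidden from every row. It is not the paper's algorithm (which analyzes nuclear norm minimization); instead it uses a standard alternating-projection heuristic between the rank-r matrices and the set of matrices agreeing with the observed entries. The matrix size, rank, and number of missing entries per row are illustrative choices.

```python
import numpy as np

def complete(M_obs, mask, rank, iters=300):
    """Fill in the entries where mask is False, assuming the target has low rank.

    Alternates between projecting onto rank-`rank` matrices (truncated SVD)
    and restoring the observed entries. A heuristic stand-in for nuclear
    norm minimization, not the paper's exact method.
    """
    X = np.where(mask, M_obs, 0.0)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank]  # project to rank r
        X[mask] = M_obs[mask]                     # re-impose observed entries
    return X

rng = np.random.default_rng(0)
n, r = 20, 1
M = np.outer(rng.standard_normal(n), rng.standard_normal(n))  # rank-1 target

# Adversarial-style missingness: hide two entries in every row.
mask = np.ones((n, n), dtype=bool)
for i in range(n):
    mask[i, rng.choice(n, size=2, replace=False)] = False

X = complete(np.where(mask, M, 0.0), mask, r)
print("max reconstruction error:", np.max(np.abs(X - M)))
```

With an incoherent rank-1 target and only constantly many missing entries per row, the iteration converges quickly; the abstract's guarantee says exact recovery is possible in far more demanding regimes (up to Ω(n) missing entries per row and column at constant rank).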
