All Papers
In masked low-rank approximation, we are given $A \in \mathbb{R}^{n \times n}$ and a binary mask $W \in \{0,1\}^{n \times n}$, and the goal is to find a rank-$k$ matrix $L$ for which $\mathrm{cost}(L)=\sum_{i=1}^{n} \sum_{j=1}^{n} W_{i,j}\cdot (A_{i,j} - L_{i,j})^2\le OPT+\epsilon \|A\|_F^2$, where $OPT = \min_{\mathrm{rank}\text{-}k\ \hat{L}} \mathrm{cost}(\hat{L})$. This problem is a special case of weighted low-rank approximation and captures low-rank plus diagonal decomposition, robust PCA, matrix completion, low-rank recovery from monotone missing data, and many other problems. Many of these problems are NP-hard, and while some algorithms with provable guarantees are known, they either 1) run in time $n^{\Omega(k^2/\epsilon)}$, or 2) make strong assumptions, e.g., that $A$ is incoherent or that $W$ is random. We consider bicriteria algorithms, which output $L$ with rank $k' > k$. We prove that a common heuristic, which simply sets $A$ to $0$ where $W$ is $0$ and then computes a standard low-rank approximation, achieves the above approximation bound with rank $k'$ depending on the communication complexity of $W$. Namely, interpreting $W$ as the communication matrix of a Boolean function $f(x,y)$ with $x,y \in \{0,1\}^{\log n}$, it suffices to set $k' = k \cdot 2^{R^{1\text{-sided}}_{\epsilon}(f)}$, where $R^{1\text{-sided}}_{\epsilon}(f)$ is the randomized communication complexity of $f$ with $1$-sided error probability $\epsilon$. For many problems, this yields bicriteria algorithms with $k' = k \cdot \mathrm{poly}(\log n / \epsilon)$. Further, we show that different models of communication yield algorithms for natural variants of the problem. E.g., multi-player communication complexity connects to tensor decomposition and non-deterministic communication complexity to Boolean low-rank factorization. Finally, we conjecture a tight relationship between masked low-rank approximation and communication complexity and give some evidence in its direction.
View on arXiv
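The abstract's "common heuristic" (zero out $A$ where $W$ is $0$, then take a standard low-rank approximation) is simple enough to sketch. Below is a minimal NumPy illustration, not code from the paper: the function names, the toy data, and the choice of inflated rank (here a placeholder `2 * k` rather than the paper's $k \cdot 2^{R^{1\text{-sided}}_{\epsilon}(f)}$ bound) are all assumptions for demonstration only.

```python
import numpy as np

def masked_cost(A, W, L):
    """cost(L) = sum_{i,j} W[i,j] * (A[i,j] - L[i,j])**2."""
    return np.sum(W * (A - L) ** 2)

def zero_fill_low_rank(A, W, k_prime):
    """Heuristic from the abstract: set A to 0 wherever the mask W is 0,
    then return the best rank-k' approximation of the masked matrix
    (computed here via a truncated SVD)."""
    A_masked = A * W
    U, s, Vt = np.linalg.svd(A_masked, full_matrices=False)
    return U[:, :k_prime] @ np.diag(s[:k_prime]) @ Vt[:k_prime, :]

if __name__ == "__main__":
    # Toy example (sizes and mask are illustrative): planted rank-k matrix,
    # with a mask that ignores the diagonal, as in low-rank plus diagonal.
    rng = np.random.default_rng(0)
    n, k = 64, 3
    A = rng.standard_normal((n, k)) @ rng.standard_normal((k, n))
    W = np.ones((n, n)) - np.eye(n)
    L = zero_fill_low_rank(A, W, k_prime=2 * k)  # placeholder bicriteria rank
    print("masked cost:", masked_cost(A, W, L))
```

In this sketch the output rank `k_prime` is left as a free parameter; the paper's contribution is showing how large it must be, in terms of the communication complexity of $W$, for the heuristic to meet the stated approximation bound.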