
Simple and Nearly-Optimal Sampling for Rank-1 Tensor Completion via Gauss-Jordan

Main: 12 pages · Bibliography: 2 pages · Appendix: 3 pages
Abstract

We revisit the sample and computational complexity of completing a rank-1 tensor in $\otimes_{i=1}^{N} \mathbb{R}^{d}$, given a uniformly sampled subset of its entries. We present a characterization of the problem (i.e., for tensors with nonzero entries) which admits an algorithm amounting to Gauss-Jordan elimination on a pair of random linear systems. For example, when $N = \Theta(1)$, we prove it uses no more than $m = O(d^2 \log d)$ samples and runs in $O(md^2)$ time. Moreover, we show any algorithm requires $\Omega(d \log d)$ samples. By contrast, existing upper bounds on the sample complexity are at least as large as $d^{1.5} \mu^{\Omega(1)} \log^{\Omega(1)} d$, where $\mu$ can be $\Theta(d)$ in the worst case. Prior work obtained these looser guarantees for higher-rank versions of our problem and tends to involve more complicated algorithms.
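To make the reduction to linear systems concrete, here is a minimal sketch of the standard log-linearization of rank-1 completion, not the paper's actual algorithm. For a tensor $T = u \otimes v \otimes w$ with nonzero entries, each sample gives the linear equation $\log|T_{ijk}| = \log|u_i| + \log|v_j| + \log|w_k|$, so the magnitudes follow from Gaussian elimination on the sampled equations (signs satisfy an analogous system over $\mathrm{GF}(2)$). The sketch assumes positive entries, uses least squares in place of explicit Gauss-Jordan for brevity, and the helper name `complete_rank1` is illustrative.

```python
import numpy as np

def complete_rank1(shape, samples):
    """Sketch: complete a positive rank-1 tensor from samples.

    samples: list of ((i, j, k), value) with value > 0.
    Each sample yields one row of the linear system in the
    unknowns (log u_i, log v_j, log w_k).
    """
    d1, d2, d3 = shape
    n_vars = d1 + d2 + d3
    A = np.zeros((len(samples), n_vars))
    b = np.zeros(len(samples))
    for row, ((i, j, k), val) in enumerate(samples):
        A[row, i] = 1.0              # coefficient of log u_i
        A[row, d1 + j] = 1.0         # coefficient of log v_j
        A[row, d1 + d2 + k] = 1.0    # coefficient of log w_k
        b[row] = np.log(val)
    # The system is consistent and rank-deficient only through the
    # usual rescaling ambiguity (u, v, w) -> (au, bv, w/(ab)); any
    # exact solution yields the same tensor.
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    u = np.exp(x[:d1])
    v = np.exp(x[d1:d1 + d2])
    w = np.exp(x[d1 + d2:])
    return np.einsum('i,j,k->ijk', u, v, w)

# Usage: sample entries of a random positive rank-1 tensor, then complete it.
rng = np.random.default_rng(0)
d = 5
u, v, w = (rng.uniform(0.5, 2.0, d) for _ in range(3))
T = np.einsum('i,j,k->ijk', u, v, w)
idx = [tuple(rng.integers(0, d, 3)) for _ in range(120)]
T_hat = complete_rank1((d, d, d), [(t, T[t]) for t in idx])
print(np.allclose(T, T_hat))  # True when the sample pattern determines T
```

Whether a given sample pattern determines the tensor is exactly the kind of question the paper's characterization addresses; the sketch above simply assumes enough samples have been drawn.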
