Low-Rank Approximation with $1/\epsilon^{1/3}$ Matrix-Vector Products

We study iterative methods based on Krylov subspaces for low-rank approximation under any Schatten-$p$ norm. Here, given access to a matrix $A$ through matrix-vector products, an accuracy parameter $\epsilon$, and a target rank $k$, the goal is to find a rank-$k$ matrix $Z$ with orthonormal columns such that $\|A(I - ZZ^\top)\|_{S_p} \leq (1+\epsilon)\min_{U^\top U = I_k}\|A(I - UU^\top)\|_{S_p}$, where $\|M\|_{S_p}$ denotes the $\ell_p$ norm of the singular values of $M$. For the special cases of $p=2$ (Frobenius norm) and $p=\infty$ (Spectral norm), Musco and Musco (NeurIPS 2015) obtained an algorithm based on Krylov methods that uses $\tilde{O}(k/\sqrt{\epsilon})$ matrix-vector products, improving on the na\"ive $\tilde{O}(k/\epsilon)$ dependence obtainable by the power method, where $\tilde{O}$ suppresses poly$(\log(dk/\epsilon))$ factors.

Our main result is an algorithm that uses only $\tilde{O}(kp^{1/6}/\epsilon^{1/3})$ matrix-vector products, and works for all $p \geq 1$. For $p = 2$, our bound improves the previous $\tilde{O}(k/\epsilon^{1/2})$ bound to $\tilde{O}(k/\epsilon^{1/3})$. Since the Schatten-$p$ and Schatten-$\infty$ norms are the same up to a $(1+\epsilon)$-factor when $p \geq (\log d)/\epsilon$, our bound recovers the result of Musco and Musco for $p = \infty$. Further, we prove a matrix-vector query lower bound of $\Omega(1/\epsilon^{1/3})$ for any fixed constant $p \geq 1$, showing that surprisingly $\tilde{\Theta}(1/\epsilon^{1/3})$ is the optimal complexity for constant~$k$.

To obtain our results, we introduce several new techniques, including optimizing over multiple Krylov subspaces simultaneously, and pinching inequalities for partitioned operators. Our lower bound for $p \in [1,2]$ uses the Araki-Lieb-Thirring trace inequality, whereas for $p > 2$, we appeal to a norm-compression inequality for aligned partitioned operators.
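For context, the following is a minimal NumPy sketch of the single-subspace block Krylov baseline in the style of Musco and Musco, which accesses $A$ only through block products with $A$ and $A^\top$. The function name, the Gaussian starting block, and the final extraction step via a projected SVD are illustrative assumptions, not the exact procedure from either paper.

```python
import numpy as np

def block_krylov_low_rank(A, k, q, rng=None):
    """Return Z (d x k, orthonormal columns) with A Z Z^T ~ A, built from
    block matrix-vector products with A and A^T only (illustrative sketch)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    n, d = A.shape
    B = A @ rng.standard_normal((d, k))     # k products with A: the block A S
    blocks = [B]
    for _ in range(q):                      # each round costs 2k products
        B = A @ (A.T @ B)                   # next block: (A A^T)^i A S
        blocks.append(B)
    K = np.hstack(blocks)                   # n x k(q+1) Krylov block
    Q, _ = np.linalg.qr(K)                  # orthonormal basis for range(K)
    # Solve the rank-k problem restricted to the subspace: top-k right
    # singular vectors of the small projected matrix Q^T A.
    _, _, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return Vt[:k].T                         # d x k with orthonormal columns

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 100)) @ rng.standard_normal((100, 100))
    Z = block_krylov_low_rank(A, k=5, q=6)
    resid = np.linalg.norm(A - (A @ Z) @ Z.T)   # Frobenius (p = 2) residual
    best = np.linalg.norm(np.linalg.svd(A, compute_uv=False)[5:])
    print(resid / best)                          # should be close to 1
```

With $q = \tilde{\Theta}(1/\sqrt{\epsilon})$ rounds this uses $\tilde{O}(k/\sqrt{\epsilon})$ matrix-vector products, matching the Musco-Musco bound; the improvement to $\tilde{O}(kp^{1/6}/\epsilon^{1/3})$ in this paper instead comes from optimizing over multiple Krylov subspaces simultaneously, rather than a single subspace as above.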