Krylov Methods are (nearly) Optimal for Low-Rank Approximation

We consider the problem of rank-$1$ low-rank approximation (LRA) in the matrix-vector product model under various Schatten norms: $\min_{\|u\|_2=1} \|A (I - u u^\top)\|_{\mathcal{S}_p}$, where $\|M\|_{\mathcal{S}_p}$ denotes the $\ell_p$ norm of the singular values of $M$. Given $\varepsilon > 0$, our goal is to output a unit vector $v$ such that \|A(I - vv^\top)\|_{\mathcal{S}_p} \leq (1+\varepsilon) \min_{\|u\|_2=1}\|A(I - u u^\top)\|_{\mathcal{S}_p}. Our main result shows that Krylov methods (nearly) achieve the information-theoretically optimal number of matrix-vector products for Spectral ($p=\infty$), Frobenius ($p=2$) and Nuclear ($p=1$) LRA. In particular, for Spectral LRA, we show that any algorithm requires $\Omega\left(\log(n)/\varepsilon^{1/2}\right)$ matrix-vector products, exactly matching the upper bound obtained by Krylov methods [MM15, BCW22]. Our lower bound addresses Open Question 1 in [Woo14], providing evidence for the lack of progress on algorithms for Spectral LRA, and resolves Open Question 1.2 in [BCW22]. Next, we show that for any fixed constant $p$, i.e. $p = O(1)$, there is an upper bound of $O\left(\log(1/\varepsilon)/\varepsilon^{1/3}\right)$ matrix-vector products, implying that the complexity does not grow as a function of the input size. This improves the $O\left(\log(n)/\varepsilon^{1/3}\right)$ bound recently obtained in [BCW22], and matches their $\Omega\left(1/\varepsilon^{1/3}\right)$ lower bound up to a $\log(1/\varepsilon)$ factor.
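To make the setting concrete, here is a minimal NumPy sketch of a Krylov-subspace method for rank-$1$ Spectral LRA: it builds a Krylov space of $A^\top A$ from a random start, then extracts the top Ritz vector. The function name `krylov_rank1` and the parameter `q` (number of matrix-vector products with $A^\top A$) are our own illustrative choices, not the exact algorithm analyzed in the paper.

```python
import numpy as np

def krylov_rank1(A, q, seed=0):
    """Return an approximate top right singular vector of A, using q
    matrix-vector products with B = A^T A (each costs two products
    with A). Illustrative Krylov + Rayleigh-Ritz sketch only."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    w = rng.standard_normal(n)
    w /= np.linalg.norm(w)
    K = np.empty((n, q))
    for j in range(q):
        K[:, j] = w
        w = A.T @ (A @ w)           # one product with B = A^T A
        w /= np.linalg.norm(w)      # normalize for numerical stability
    Q, _ = np.linalg.qr(K)          # orthonormal basis of the Krylov space
    AQ = A @ Q
    T = AQ.T @ AQ                   # Q^T B Q  (Rayleigh-Ritz projection)
    _, evecs = np.linalg.eigh(T)    # eigenvalues in ascending order
    v = Q @ evecs[:, -1]            # lift the top Ritz vector back to R^n
    return v / np.linalg.norm(v)

# Usage: the spectral error ||A(I - vv^T)||_2 approaches the optimum,
# which equals the second singular value of A, as q grows.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 40))
v = krylov_rank1(A, q=15)
err = np.linalg.norm(A - np.outer(A @ v, v), 2)
sigma = np.linalg.svd(A, compute_uv=False)
```

Projecting onto the full Krylov space rather than keeping only the last power-iteration vector is what yields the accelerated $1/\varepsilon^{1/2}$-type dependence for the spectral norm.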