
Fast Global Convergence of Online PCA

Abstract

We study online principal component analysis (PCA), that is, finding the top $k$ eigenvectors of a $d \times d$ hidden matrix ${\bf \Sigma}$ from online data samples drawn with covariance matrix ${\bf \Sigma}$. We provide a global convergence analysis for the low-rank generalization of Oja's algorithm, which is widely used in practice but lacks theoretical understanding. Our convergence rate matches the lower bound in terms of its dependency on the error, on the eigengap, and on the dimension $d$; in addition, our convergence rate can be made gap-free, that is, proportional to the approximation error and independent of the eigengap. In contrast, for general rank $k$, before our work (1) it was open to design any algorithm with an efficient global convergence rate; and (2) it was open to design any algorithm with an (even local) gap-free convergence rate.
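For concreteness, below is a minimal NumPy sketch of the low-rank (block) generalization of Oja's algorithm that the abstract refers to: each streamed sample drives a stochastic gradient step on the current $d \times k$ iterate, followed by QR re-orthonormalization. The function name `oja_top_k`, the constant step size `eta`, and the synthetic demo are illustrative assumptions, not the paper's exact choices; the analyzed variant may, for instance, use a specific learning-rate schedule.

```python
import numpy as np

def oja_top_k(samples, d, k, eta=0.05, seed=0):
    """Low-rank Oja sketch: track the top-k eigenspace of the covariance
    matrix E[x x^T] from a stream of samples x_t in R^d."""
    rng = np.random.default_rng(seed)
    # Random orthonormal initialization Q_0 in R^{d x k}.
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))
    for x in samples:
        x = np.asarray(x).reshape(-1, 1)
        # Stochastic gradient step: Q <- Q + eta * x (x^T Q).
        Q = Q + eta * x @ (x.T @ Q)
        # QR re-orthonormalization keeps the columns orthonormal.
        Q, _ = np.linalg.qr(Q)
    return Q  # columns approximately span the top-k eigenspace

# Demo (illustrative): recover the top-2 eigenspace of a synthetic Sigma.
rng = np.random.default_rng(1)
d, k, n = 50, 2, 20000
U, _ = np.linalg.qr(rng.standard_normal((d, d)))
eigs = np.array([5.0, 4.0] + [0.1] * (d - 2))
L = U * np.sqrt(eigs)  # Sigma = L @ L.T = U diag(eigs) U^T
stream = (L @ rng.standard_normal(d) for _ in range(n))
Q = oja_top_k(stream, d, k)
# Singular values of Q^T U[:, :k] are the cosines of the principal
# angles; all near 1 iff the learned subspace aligns with the truth.
print(np.linalg.svd(Q.T @ U[:, :k], compute_uv=False))
```

The QR step is the natural rank-$k$ analogue of the normalization in rank-one Oja: it projects the updated iterate back onto the set of $d \times k$ matrices with orthonormal columns, so the algorithm maintains an orthonormal basis for its current subspace estimate throughout the stream.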
