
arXiv:1607.07837 (v4, latest)
First Efficient Convergence for Streaming k-PCA: a Global, Gap-Free, and Near-Optimal Rate

26 July 2016
Zeyuan Allen-Zhu
Yuanzhi Li
Abstract

We study streaming principal component analysis (PCA), that is, finding, in O(dk) space, the top k eigenvectors of a d×d hidden matrix Σ given online vectors drawn from a distribution with covariance Σ. We provide global convergence for Oja's algorithm, which is popular in practice but lacked theoretical understanding for k > 1. We also provide a modified variant, Oja++, that runs even faster than Oja's. Our results match the information-theoretic lower bound in terms of the dependency on error, on eigengap, on rank k, and on dimension d, up to poly-log factors. In addition, our convergence rate can be made gap-free, that is, proportional to the approximation error and independent of the eigengap. In contrast, for general rank k, before our work (1) it was open to design any algorithm with an efficient global convergence rate; and (2) it was open to design any algorithm with an (even local) gap-free convergence rate in O(dk) space.
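The abstract's setting can be illustrated with a minimal sketch of the classical Oja update for streaming k-PCA: maintain a d×k orthonormal basis Q in O(dk) space, take a stochastic gradient step toward each incoming sample's rank-one direction, and re-orthonormalize. This is the textbook form of Oja's algorithm, not the paper's Oja++ variant; the constant step size `eta` is a hypothetical simplification, whereas the paper's analysis uses carefully scheduled step sizes.

```python
import numpy as np

def oja_streaming_kpca(stream, d, k, eta=0.02, seed=None):
    """Sketch of Oja's algorithm for streaming k-PCA.

    Keeps a d x k orthonormal basis Q (O(dk) memory). For each sample x
    drawn i.i.d. with covariance Sigma, it applies the rank-one update
    Q <- Q + eta * x (x^T Q) and re-orthonormalizes via QR.
    `eta` is a fixed illustrative step size (an assumption, not the
    schedule analyzed in the paper).
    """
    rng = np.random.default_rng(seed)
    # Random orthonormal initialization of the k-dimensional basis.
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))
    for x in stream:
        Q = Q + eta * np.outer(x, x @ Q)  # stochastic rank-one step
        Q, _ = np.linalg.qr(Q)            # restore orthonormal columns
    return Q
```

With samples whose covariance has two dominant eigenvalues, the returned basis should align closely with the top-2 eigenspace after enough iterations.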
