
arXiv:1602.07046
An Improved Gap-Dependency Analysis of the Noisy Power Method

23 February 2016
Maria-Florina Balcan, S. Du, Yining Wang, Adams Wei Yu
Abstract

We consider the noisy power method algorithm, which has wide applications in machine learning and statistics, especially those related to principal component analysis (PCA) under resource (communication, memory, or privacy) constraints. Existing analysis of the noisy power method shows an unsatisfactory dependency on the "consecutive" spectral gap $(\sigma_k-\sigma_{k+1})$ of the input data matrix, which could be very small and hence limits the algorithm's applicability. In this paper, we present a new analysis of the noisy power method that achieves improved gap dependency for both sample complexity and noise tolerance bounds. More specifically, we improve the dependency on $(\sigma_k-\sigma_{k+1})$ to a dependency on $(\sigma_k-\sigma_{q+1})$, where $q$ is an intermediate algorithm parameter that could be much larger than the target rank $k$. Our proofs are built upon a novel characterization of proximity between two subspaces that differs from the canonical angle characterizations analyzed in previous works. Finally, we apply our improved bounds to distributed private PCA and memory-efficient streaming PCA, and obtain bounds that are superior to existing results in the literature.
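For orientation, the sketch below shows the kind of procedure the abstract refers to: subspace (power) iteration on a symmetric matrix with noise added at every multiplication step, run with an iteration rank q that may be larger than the target rank k. The function name, the Gaussian noise model, and all parameter choices here are illustrative assumptions rather than the paper's specification; the paper's contribution is the analysis showing the output quality can be controlled by the gap $\sigma_k-\sigma_{q+1}$ instead of $\sigma_k-\sigma_{k+1}$.

```python
import numpy as np

def noisy_power_method(A, k, q, iters, noise_scale=0.0, seed=None):
    """Minimal sketch of noisy subspace iteration (assumed form, not the
    paper's exact algorithm statement).

    A           : (n, n) symmetric input matrix (e.g. a sample covariance)
    k           : target rank; the top-k subspace estimate is returned
    q           : intermediate iteration rank, q >= k
    iters       : number of power iterations
    noise_scale : std-dev of per-step Gaussian noise, standing in for
                  privacy / communication / streaming perturbations
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Random orthonormal start X_0 of width q.
    X, _ = np.linalg.qr(rng.standard_normal((n, q)))
    for _ in range(iters):
        # One noisy multiplication: Y_t = A X_{t-1} + G_t.
        G = noise_scale * rng.standard_normal((n, q))
        Y = A @ X + G
        # Re-orthonormalize to keep the iterate well conditioned.
        X, _ = np.linalg.qr(Y)
    # The first k columns estimate the top-k eigenspace of A.
    return X[:, :k]

if __name__ == "__main__":
    # Tiny usage example on a synthetic covariance matrix.
    rng = np.random.default_rng(0)
    B = rng.standard_normal((200, 50))
    A = B @ B.T / 50.0
    U_hat = noisy_power_method(A, k=5, q=20, iters=50, noise_scale=1e-3, seed=1)
    print(U_hat.shape)  # (200, 5)
```

With noise_scale set to zero this reduces to standard subspace iteration; the regime studied in the paper is the one where the per-step noise G_t is non-negligible and must be tolerated by the analysis.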
