Improved analysis of randomized SVD for top-eigenvector approximation

16 February 2022
Ruo-Chun Tzeng
Po-An Wang
Florian Adriaens
Aristides Gionis
Chi-Jen Lu
Abstract

Computing the top eigenvectors of a matrix is a problem of fundamental interest to various fields. While the majority of the literature has focused on analyzing the reconstruction error of low-rank matrices associated with the retrieved eigenvectors, in many applications one is interested in finding one vector with high Rayleigh quotient. In this paper we study the problem of approximating the top-eigenvector. Given a symmetric matrix $\mathbf{A}$ with largest eigenvalue $\lambda_1$, our goal is to find a vector $\hat{\mathbf{u}}$ that approximates the leading eigenvector $\mathbf{u}_1$ with high accuracy, as measured by the ratio $R(\hat{\mathbf{u}}) = \lambda_1^{-1}\,\hat{\mathbf{u}}^T\mathbf{A}\hat{\mathbf{u}} / \hat{\mathbf{u}}^T\hat{\mathbf{u}}$. We present a novel analysis of the randomized SVD algorithm of \citet{halko2011finding} and derive tight bounds in many cases of interest. Notably, this is the first work that provides non-trivial bounds of $R(\hat{\mathbf{u}})$ for randomized SVD with any number of iterations. Our theoretical analysis is complemented with a thorough experimental study that confirms the efficiency and accuracy of the method.
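To make the setting concrete, here is a minimal numpy sketch of the kind of procedure the abstract refers to: a randomized SVD in the style of Halko et al. (Gaussian sketch, a few power iterations, Rayleigh-Ritz on the orthonormalized range), followed by the quality measure $R(\hat{\mathbf{u}})$. This is not the authors' code; the function names, the sketch size, and the number of iterations are illustrative choices, not values from the paper.

```python
import numpy as np

def randomized_top_eigvec(A, sketch_size=10, n_iter=2, seed=None):
    """Approximate the leading eigenvector of a symmetric matrix A
    via a randomized-SVD-style scheme (illustrative sketch):
    Gaussian test matrix, power iterations with re-orthonormalization,
    then a Rayleigh-Ritz step on the projected matrix."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Range sketch: Y = A @ Omega with a Gaussian test matrix Omega.
    Y = A @ rng.standard_normal((n, sketch_size))
    # Power iterations, re-orthonormalizing for numerical stability.
    for _ in range(n_iter):
        Q, _ = np.linalg.qr(Y)
        Y = A @ Q
    Q, _ = np.linalg.qr(Y)
    # Rayleigh-Ritz: eigendecompose the small projected matrix Q^T A Q.
    B = Q.T @ A @ Q
    _, V = np.linalg.eigh(B)      # eigenvalues in ascending order
    return Q @ V[:, -1]           # lift the top Ritz vector back to R^n

def rayleigh_ratio(A, u_hat, lam1):
    """R(u_hat) = lam1^{-1} * (u_hat^T A u_hat) / (u_hat^T u_hat)."""
    return (u_hat @ A @ u_hat) / (lam1 * (u_hat @ u_hat))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((500, 500))
    A = (M + M.T) / 2                      # random symmetric test matrix
    lam1 = np.linalg.eigvalsh(A)[-1]       # exact top eigenvalue, for reference
    u_hat = randomized_top_eigvec(A, sketch_size=10, n_iter=3, seed=0)
    print("R(u_hat) =", rayleigh_ratio(A, u_hat, lam1))
```

A ratio $R(\hat{\mathbf{u}})$ close to 1 means the returned vector nearly attains the largest Rayleigh quotient; the paper's contribution is bounding this ratio for the randomized SVD algorithm at any number of iterations.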
