Generative Principal Component Analysis

18 March 2022
Zhaoqiang Liu
Jiulong Liu
Subhro Ghosh
Jun Han
Jonathan Scarlett
Abstract

In this paper, we study the problem of principal component analysis with generative modeling assumptions, adopting a general model for the observed matrix that encompasses notable special cases, including spiked matrix recovery and phase retrieval. The key assumption is that the underlying signal lies near the range of an L-Lipschitz continuous generative model with bounded k-dimensional inputs. We propose a quadratic estimator, and show that it enjoys a statistical rate of order sqrt((k log L) / m), where m is the number of samples. We also provide a near-matching algorithm-independent lower bound. Moreover, we provide a variant of the classic power method, which projects the calculated data onto the range of the generative model during each iteration. We show that under suitable conditions, this method converges exponentially fast to a point achieving the above-mentioned statistical rate. We perform experiments on various image datasets for spiked matrix and phase retrieval models, and illustrate the performance gains of our method over the classic power method and the truncated power method devised for sparse principal component analysis.
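The projected power-method variant described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `project` callable stands in for the (unspecified) projection onto the range of the generative model, and the function name and signature are assumptions for exposition.

```python
import numpy as np

def projected_power_method(V, project, x0, num_iters=50):
    """Power-method variant: after each matrix-vector step, project the
    iterate back onto the range of the generative model via `project`
    (a stand-in for the projection step described in the paper)."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(num_iters):
        y = V @ x                  # standard power-iteration step
        x = project(y)             # project onto range of generative model
        x = x / np.linalg.norm(x)  # renormalize to the unit sphere
    return x
```

With `project` set to the identity, this reduces to the classic power method and recovers the top eigenvector of a symmetric matrix V; in the paper's setting, `project` would instead map onto the range of the k-dimensional generative model.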
