Fast Computation of Generalized Eigenvectors for Manifold Graph Embedding

15 December 2021
Fei Chen, Gene Cheung, Xue Zhang
Abstract

Our goal is to efficiently compute low-dimensional latent coordinates for nodes in an input graph -- known as graph embedding -- for subsequent data processing such as clustering. Focusing on finite graphs that are interpreted as uniform samples on continuous manifolds (called manifold graphs), we leverage existing fast extreme eigenvector computation algorithms for speedy execution. We first pose a generalized eigenvalue problem for the sparse matrix pair $(\mathbf{A}, \mathbf{B})$, where $\mathbf{A} = \mathbf{L} - \mu \mathbf{Q} + \epsilon \mathbf{I}$ is a sum of the graph Laplacian $\mathbf{L}$ and the disconnected two-hop difference matrix $\mathbf{Q}$. The eigenvector $\mathbf{v}$ minimizing the Rayleigh quotient $\frac{\mathbf{v}^{\top} \mathbf{A} \mathbf{v}}{\mathbf{v}^{\top} \mathbf{v}}$ thus minimizes 1-hop neighbor distances while maximizing distances between disconnected 2-hop neighbors, preserving graph structure. The matrix $\mathbf{B} = \mathrm{diag}(\{b_i\})$ that defines eigenvector orthogonality is then chosen so that boundary and interior nodes in the sampling domain have the same generalized degrees. The $K$-dimensional latent vectors for the $N$ graph nodes are the first $K$ generalized eigenvectors of $(\mathbf{A}, \mathbf{B})$, computed in $\mathcal{O}(N)$ using LOBPCG, where $K \ll N$. Experiments show that our embedding is among the fastest in the literature, while producing the best clustering performance for manifold graphs.
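
The abstract outlines a concrete pipeline: build $\mathbf{A} = \mathbf{L} - \mu \mathbf{Q} + \epsilon \mathbf{I}$ from the graph, pick a positive diagonal $\mathbf{B}$, and take the first $K$ generalized eigenvectors of $(\mathbf{A}, \mathbf{B})$ via LOBPCG. The sketch below, using SciPy's `lobpcg`, illustrates that flow under stated assumptions; it is not the authors' implementation. In particular, the construction of the two-hop matrix $\mathbf{Q}$, the values of `mu` and `eps`, and the choice of $\mathbf{B}$ as a degree-based positive diagonal are illustrative stand-ins.

```python
# Illustrative sketch of the embedding pipeline described in the abstract.
# Assumes W is a symmetric, nonnegative scipy.sparse adjacency matrix;
# mu, eps, and the B = diag(b_i) choice below are hypothetical, not the paper's.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lobpcg

def manifold_graph_embedding(W, K=2, mu=0.1, eps=1e-6, seed=0):
    """Return K-dimensional latent coordinates for the N nodes of graph W."""
    N = W.shape[0]

    # Graph Laplacian L = D - W.
    d = np.asarray(W.sum(axis=1)).ravel()
    L = sp.diags(d) - W

    # Disconnected two-hop difference matrix Q: here sketched as the Laplacian
    # of the graph connecting node pairs reachable in two hops but not one.
    C = ((W @ W) > 0).astype(float)               # pairs within two hops
    C = C - C.multiply((W > 0).astype(float))     # drop directly connected pairs
    C = C - sp.diags(C.diagonal())                # drop self-pairs
    Q = sp.diags(np.asarray(C.sum(axis=1)).ravel()) - C

    # A = L - mu*Q + eps*I, as in the abstract.
    A = (L - mu * Q + eps * sp.eye(N)).tocsr()

    # B = diag({b_i}): a positive stand-in based on A's generalized degrees,
    # so boundary and interior nodes are normalized comparably.
    b = np.maximum(np.asarray(A.sum(axis=1)).ravel(), eps)
    B = sp.diags(b)

    # First K generalized eigenvectors of (A, B), smallest eigenvalues first,
    # computed with LOBPCG (cost linear in N per iteration for sparse A, B).
    rng = np.random.default_rng(seed)
    X0 = rng.standard_normal((N, K))
    _, V = lobpcg(A, X0, B=B, largest=False, tol=1e-6, maxiter=200)
    return V                                      # row i = latent vector of node i

# Example: embed a 20-node ring graph into 2-D and cluster on the result.
W = sp.csr_matrix(
    (np.ones(20), (np.arange(20), (np.arange(20) + 1) % 20)), shape=(20, 20))
W = W + W.T
coords = manifold_graph_embedding(W, K=2)
```

The returned rows can be fed directly to a standard clustering routine (e.g. k-means), which is the downstream use case the abstract mentions.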

View on arXiv: 2112.07862