arXiv:2209.12054
From Local to Global: Spectral-Inspired Graph Neural Networks

24 September 2022
Ningyuan Huang
Soledad Villar
Carey E. Priebe
Da Zheng
Cheng-Fu Huang
Lin F. Yang
Vladimir Braverman
Abstract

Graph Neural Networks (GNNs) are powerful deep learning methods for non-Euclidean data. Popular GNNs are message-passing neural networks (MPNNs) that aggregate and combine signals in a local graph neighborhood. However, shallow MPNNs tend to miss long-range signals and perform poorly on some heterophilous graphs, while deep MPNNs can suffer from issues such as over-smoothing or over-squashing. To mitigate these issues, existing works typically borrow normalization techniques from training neural networks on Euclidean data, or modify the graph structures. Yet these approaches are not well understood theoretically and could increase the overall computational complexity. In this work, we draw inspiration from spectral graph embedding and propose PowerEmbed -- a simple layer-wise normalization technique to boost MPNNs. We show that PowerEmbed can provably express the top-k leading eigenvectors of the graph operator, which prevents over-smoothing and is agnostic to the graph topology; meanwhile, it produces a list of representations ranging from local features to global signals, which avoids over-squashing. We apply PowerEmbed to a wide range of simulated and real graphs and demonstrate its competitive performance, particularly on heterophilous graphs.
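The abstract's claim that a layer-wise normalization can "express the top-k leading eigenvectors of the graph operator" echoes classical orthogonal (block power) iteration. The sketch below is not the paper's implementation -- the function name and toy matrix are illustrative assumptions -- but it shows the underlying spectral mechanism: repeated propagation by a symmetric operator plus a per-layer orthonormalization converges to the leading eigenvector subspace instead of collapsing to a single dominant direction (the over-smoothing failure mode).

```python
import numpy as np

def top_k_subspace(A, k, n_iters=200, seed=0):
    """Hypothetical sketch: orthogonal (block power) iteration.

    Approximates the subspace spanned by the top-k eigenvectors of a
    symmetric operator A. Each step multiplies by A (analogous to one
    message-passing propagation) and re-orthonormalizes via QR
    (analogous to a layer-wise normalization).
    """
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((A.shape[0], k))
    for _ in range(n_iters):
        X = A @ X               # propagate signals through the operator
        X, _ = np.linalg.qr(X)  # layer-wise normalization step
    return X

# Toy symmetric "graph operator" with distinct eigenvalues.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

U = top_k_subspace(A, k=2)

# Compare against the true leading eigenvectors: the singular values of
# U^T V_top are the cosines of the principal angles between the two
# subspaces, so values near 1 mean the subspaces coincide.
w, V = np.linalg.eigh(A)
V_top = V[:, np.argsort(w)[::-1][:2]]
overlap = np.linalg.svd(U.T @ V_top, compute_uv=False)
```

Without the QR step, every column of X would align with the single dominant eigenvector; the per-layer orthonormalization is what preserves the full top-k subspace, which is the spectral property the abstract attributes to PowerEmbed.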
