DeGNN: Characterizing and Improving Graph Neural Networks with Graph Decomposition

10 October 2019
Xupeng Miao, Nezihe Merve Gürel, Wentao Zhang, Zhichao Han, Bo-wen Li, Wei Min, Xi Rao, Hansheng Ren, Yinan Shan, Yingxia Shao, Yujie Wang, Fan Wu, Hui Xue, Yaming Yang, Zitao Zhang, Yang Zhao, Shuai Zhang, Yujing Wang, Bin Cui, Ce Zhang
Abstract

Despite the wide application of the Graph Convolutional Network (GCN), one major limitation is that it does not benefit from increasing depth and suffers from the oversmoothing problem. In this work, we first characterize this phenomenon from an information-theoretic perspective and show that, under certain conditions, the mutual information between the input of a GCN and its output after $l$ layers converges to 0 exponentially with respect to $l$. We also show that graph decomposition can weaken the conditions under which this convergence occurs, which enables our analysis to cover GraphCNN as well. Since a given graph structure benefits only from a decomposition matched to it, we propose an automatic connectivity-aware graph decomposition algorithm, DeGNN, to improve the performance of general graph neural networks. Extensive experiments on widely adopted benchmark datasets demonstrate that DeGNN not only significantly boosts the performance of the corresponding GNNs but also achieves state-of-the-art performance.
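For intuition, the oversmoothing claim can be read as a mutual-information bound. The notation below is illustrative rather than taken from the abstract ($X$ for the input features, $H^{(l)}$ for the representation after $l$ GCN layers, and constants $C, \alpha > 0$ depending on the graph and the stated conditions, which are established in the paper body):

    I(X; H^{(l)}) \le C \, e^{-\alpha l}, \qquad C, \alpha > 0,

so $I(X; H^{(l)}) \to 0$ exponentially as the depth $l$ grows.

The abstract does not spell out the decomposition algorithm itself, so the following is only a minimal sketch of the general idea: split the edge set into several subgraphs, run one GCN branch per subgraph, and concatenate the branch outputs. The round-robin edge split here is a hypothetical stand-in for DeGNN's connectivity-aware rule, and all function names are illustrative.

    import numpy as np

    def normalize_adj(A):
        # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
        A_hat = A + np.eye(A.shape[0])
        d = A_hat.sum(axis=1)
        D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
        return D_inv_sqrt @ A_hat @ D_inv_sqrt

    def decompose_edges(A, k=2):
        # Stand-in for DeGNN's connectivity-aware decomposition (the exact
        # rule is not given in the abstract): split the undirected edges
        # into k subgraphs round-robin by edge index.
        subgraphs = [np.zeros_like(A) for _ in range(k)]
        edges = np.argwhere(np.triu(A) > 0)
        for idx, (i, j) in enumerate(edges):
            g = subgraphs[idx % k]
            g[i, j] = g[j, i] = 1.0
        return subgraphs

    def decomposed_gcn_layer(A_parts, H, weights):
        # One GCN branch per decomposed subgraph; branch outputs are
        # concatenated, so each branch propagates over a sparser graph.
        outs = [np.maximum(0, normalize_adj(Ak) @ H @ Wk)
                for Ak, Wk in zip(A_parts, weights)]
        return np.concatenate(outs, axis=1)

    # Toy usage: 5 nodes, 4 input features, 2-way decomposition.
    rng = np.random.default_rng(0)
    A = np.array([[0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0],
                  [1, 1, 0, 1, 1],
                  [0, 1, 1, 0, 1],
                  [0, 0, 1, 1, 0]], dtype=float)
    H = rng.normal(size=(5, 4))
    parts = decompose_edges(A, k=2)
    weights = [rng.normal(size=(4, 8)) for _ in parts]
    H1 = decomposed_gcn_layer(parts, H, weights)  # shape (5, 16)

Stacking such layers, each branch propagates over a sparser subgraph, which is the mechanism the abstract credits with weakening the conditions for exponential information loss.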
