
Improving the Expressiveness of K-hop Message-Passing GNNs by Injecting Contextualized Substructure Information

Tianjun Yao
Yiongxu Wang
Kun Zhang
Shangsong Liang
Abstract

Graph neural networks (GNNs) have become the \textit{de facto} standard for representation learning on graphs and have achieved state-of-the-art performance in many graph-related tasks; however, it has been shown that the expressive power of standard GNNs is at most equivalent to the 1-dimensional Weisfeiler-Lehman (1-WL) test. Recently, a number of works have aimed to enhance the expressive power of graph neural networks. One line of work develops K-hop message-passing GNNs, in which a node's representation is updated by aggregating information not only from its direct neighbors but from all neighbors within K hops of the node. Another line of work leverages subgraph information and is provably strictly more powerful than the 1-WL test. In this work, we discuss the limitations of K-hop message-passing GNNs and propose a \textit{substructure encoding function} to uplift the expressive power of any K-hop message-passing GNN. We further inject contextualized substructure information to enhance the expressiveness of K-hop message-passing GNNs. Our method is provably more powerful than previous works on K-hop graph neural networks and 1-WL subgraph GNNs (a specific type of subgraph-based GNN model), and is no less powerful than 3-WL. Empirically, our proposed method sets new state-of-the-art performance or achieves comparable performance on a variety of datasets. Our code is available at \url{https://github.com/tianyao-aka/Expresive_K_hop_GNNs}.
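To make the K-hop aggregation idea concrete, below is a minimal, self-contained sketch of one K-hop message-passing layer. It is illustrative only and not the authors' implementation: the BFS-based `k_hop_neighbors` helper, the sum aggregator, and the per-hop scalar `hop_weights` are all assumptions chosen for brevity; the paper's substructure encoding function is not shown here.

```python
# Illustrative sketch of K-hop message passing (not the paper's method).
# Each node aggregates features from all neighbors within K hops,
# with a hypothetical scalar weight per hop distance.
from collections import deque
import numpy as np

def k_hop_neighbors(adj, node, K):
    """Return {k: set of nodes exactly k hops from `node`} for k = 1..K (BFS)."""
    dist = {node: 0}
    hops = {k: set() for k in range(1, K + 1)}
    queue = deque([node])
    while queue:
        u = queue.popleft()
        if dist[u] == K:
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                hops[dist[v]].add(v)
                queue.append(v)
    return hops

def k_hop_layer(adj, X, K, hop_weights):
    """One K-hop aggregation step: sum the features of each hop's neighbors,
    scale by that hop's weight, and combine with the node's own features."""
    out = np.zeros_like(X)
    for v in adj:
        agg = X[v].copy()
        for k, nbrs in k_hop_neighbors(adj, v, K).items():
            if nbrs:
                agg += hop_weights[k - 1] * X[list(nbrs)].sum(axis=0)
        out[v] = agg
    return out

# Toy usage: a 4-node path graph 0-1-2-3 with 2-dimensional node features.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
X = np.eye(4, 2)  # placeholder features, just to show shapes
print(k_hop_layer(adj, X, K=2, hop_weights=[1.0, 0.5]))
```

In a learned model the scalar hop weights would typically be replaced by hop-specific trainable transformations, and the per-node loop by batched sparse operations; the sketch only conveys which nodes contribute to each update.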
