Walk Message Passing Neural Networks and Second-Order Graph Neural Networks

The expressive power of message passing neural networks (MPNNs) is known to match that of the 1-dimensional Weisfeiler-Leman (1-WL) graph isomorphism test. To boost the expressive power of MPNNs, a number of graph neural network architectures have recently been proposed based on higher-dimensional Weisfeiler-Leman tests. In this paper we consider the two-dimensional Weisfeiler-Leman (2-WL) test and introduce a new type of MPNNs, referred to as ℓ-walk MPNNs, which aggregate features along walks of length ℓ between vertices. We show that 2-walk MPNNs match 2-WL in expressive power. More generally, ℓ-walk MPNNs, for any ℓ ≥ 2, are shown to match the expressive power of the recently introduced ℓ-walk refinement procedure (W[ℓ]). Based on a correspondence between 2-WL and W[ℓ], we observe that ℓ-walk MPNNs and 2-walk MPNNs have the same expressive power, i.e., they can distinguish the same pairs of graphs, but ℓ-walk MPNNs can possibly distinguish pairs of graphs faster than 2-walk MPNNs. When it comes to concrete learnable graph neural network (GNN) formalisms that match 2-WL or W[ℓ] in expressive power, we consider second-order graph neural networks that allow for non-linear layers. In particular, to match W[ℓ] in expressive power, we allow ℓ−1 matrix multiplications in each layer. We propose different versions of second-order GNNs depending on the type of features (i.e., coming from a countable set or from an uncountable set), as this affects the number of dimensions needed to represent the features. Our results indicate that increasing the non-linearity of layers by allowing multiple matrix multiplications does not increase expressive power. At the very best, it results in a faster distinction of input graphs.
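To make the role of the ℓ−1 matrix multiplications concrete, the following is a minimal NumPy sketch, not the paper's exact formalism: vertex-pair features are stored as an n×n×d tensor, and chaining ℓ−1 channel-wise matrix products propagates information along walks of length ℓ between vertex pairs. The names `walk_layer`, `W1`, and `W2` are illustrative and do not come from the paper.

```python
import numpy as np

def walk_layer(F, ell, W1, W2):
    """Toy second-order layer using ell-1 matrix multiplications.

    F  : (n, n, d) array of vertex-pair features.
    W1 : (d, d') weight matrix applied position-wise.
    W2 : (d, d') weight matrix applied position-wise.
    Returns an (n, n, d') array; the chain of ell-1 channel-wise
    matrix products aggregates features along walks of length ell.
    """
    # Position-wise linear maps on each pair feature.
    A = F @ W1   # (n, n, d')
    B = F @ W2   # (n, n, d')
    out = A
    for _ in range(ell - 1):
        # Channel-wise matrix multiplication: sum over intermediate
        # vertex k extends every walk by one step.
        out = np.einsum('ikc,kjc->ijc', out, B)
    return out

# Toy usage on a random 5-vertex graph with 4-dimensional pair features.
rng = np.random.default_rng(0)
F = rng.standard_normal((5, 5, 4))
W1 = rng.standard_normal((4, 8))
W2 = rng.standard_normal((4, 8))
print(walk_layer(F, ell=3, W1=W1, W2=W2).shape)  # (5, 5, 8)
```

With ell=2 the loop performs a single matrix product per channel, matching the 2-walk case; larger ell adds more multiplications per layer, which, per the abstract, does not add expressive power but may let the network distinguish graphs in fewer layers.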