Walk Message Passing Neural Networks and Second-Order Graph Neural Networks

Abstract

The expressive power of message passing neural networks (MPNNs) is known to match the expressive power of the 1-dimensional Weisfeiler-Leman (1-WL) graph isomorphism test. To boost the expressive power of MPNNs, a number of graph neural network architectures have recently been proposed based on higher-dimensional Weisfeiler-Leman tests. In this paper we consider the two-dimensional Weisfeiler-Leman (2-WL) test and introduce a new type of MPNNs, referred to as \ell-walk MPNNs, which aggregate features along walks of length \ell between vertices. We show that 2-walk MPNNs match 2-WL in expressive power. More generally, \ell-walk MPNNs, for any \ell\geq 2, are shown to match the expressive power of the recently introduced \ell-walk refinement procedure (W[\ell]). Based on a correspondence between 2-WL and W[\ell], we observe that \ell-walk MPNNs and 2-walk MPNNs have the same expressive power, i.e., they can distinguish the same pairs of graphs, but \ell-walk MPNNs can possibly distinguish pairs of graphs faster than 2-walk MPNNs. When it comes to concrete learnable graph neural network (GNN) formalisms that match 2-WL or W[\ell] in expressive power, we consider second-order graph neural networks that allow for non-linear layers. In particular, to match W[\ell] in expressive power, we allow \ell-1 matrix multiplications in each layer. We propose different versions of second-order GNNs depending on the type of features (i.e., coming from a countable set or from an uncountable set), as this affects the number of dimensions needed to represent the features. Our results indicate that increasing the non-linearity of layers by allowing multiple matrix multiplications does not increase expressive power; at best, it results in a faster distinction of input graphs.
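
To make the role of the \ell-1 matrix multiplications concrete, below is a minimal sketch (in PyTorch) of a second-order GNN layer: features live on ordered vertex pairs, stored as an n x n x d tensor, and a layer with parameter \ell chains \ell-1 matrix products over the vertex dimensions, mimicking aggregation along walks of length \ell. The class name, the per-operand linear maps, the skip term, and the ReLU are illustrative assumptions for this sketch, not the paper's exact architecture.

    import torch
    import torch.nn as nn

    class SecondOrderWalkLayer(nn.Module):
        # Hypothetical sketch of a second-order layer with ell-1 matrix
        # multiplications; not the paper's exact construction.
        def __init__(self, d_in: int, d_out: int, ell: int = 2):
            super().__init__()
            assert ell >= 2, "walk length ell must be at least 2"
            # ell linear maps produce the ell operands of the chained
            # product; chaining ell operands costs ell-1 multiplications.
            self.maps = nn.ModuleList(nn.Linear(d_in, d_out) for _ in range(ell))
            self.skip = nn.Linear(d_in, d_out)  # assumed residual-style term

        def forward(self, X: torch.Tensor) -> torch.Tensor:
            # X: (n, n, d_in), one feature vector per ordered vertex pair.
            factors = [m(X).permute(2, 0, 1) for m in self.maps]  # each (d_out, n, n)
            out = factors[0]
            for F in factors[1:]:           # ell-1 multiplications in total
                out = torch.matmul(out, F)  # batched over the feature dimension
            # Entry (v, w) of the chained product aggregates over all walks
            # of length ell from v to w, one intermediate vertex per factor.
            return torch.relu(out.permute(1, 2, 0) + self.skip(X))

    layer = SecondOrderWalkLayer(d_in=8, d_out=16, ell=3)  # two matmuls per layer
    X = torch.randn(5, 5, 8)  # pair features for a 5-vertex graph
    Y = layer(X)              # shape (5, 5, 16)

For \ell = 2 this reduces to a single matrix multiplication per layer, the setting shown to match 2-WL in expressive power.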
