
Higher-Order Graphon Neural Networks: Approximation and Cut Distance

International Conference on Learning Representations (ICLR), 2025
Main: 10 pages
Appendix: 39 pages
Bibliography: 4 pages
6 figures, 2 tables
Abstract

Graph limit models, like graphons for limits of dense graphs, have recently been used to study size transferability of graph neural networks (GNNs). While most literature focuses on message passing GNNs (MPNNs), in this work we attend to the more powerful higher-order GNNs. First, we extend the k-WL test for graphons (Böker, 2023) to the graphon-signal space and introduce signal-weighted homomorphism densities as a key tool. As an exemplary focus, we generalize Invariant Graph Networks (IGNs) to graphons, proposing Invariant Graphon Networks (IWNs), defined via a subset of the IGN basis corresponding to bounded linear operators. Even with this restricted basis, we show that IWNs of order k are at least as powerful as the k-WL test, and we establish universal approximation results for graphon-signals in L^p distances. This significantly extends the prior work of Cai & Wang (2022), showing that IWNs, a subset of their IGN-small, effectively retain the same expressivity as the full IGN basis in the limit. In contrast to their approach, our blueprint of IWNs also aligns better with the geometry of graphon space, for example by facilitating comparability to MPNNs. We highlight that, while typical higher-order GNNs are discontinuous with respect to cut distance (which causes their lack of convergence and is inherently tied to the definition of k-WL), transferability remains achievable.
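To make the central tool concrete: the homomorphism density t(F, W) of a finite graph F in a graphon W is the probability that a uniformly random map of F's vertices into [0, 1] realizes every edge of F under W. The sketch below is a purely illustrative Monte Carlo estimator for the classical (unweighted) density; the function names and the constant test graphon are assumptions for illustration, not the paper's construction, and the signal-weighted variant introduced in the paper would additionally multiply in signal values at the sampled points.

```python
import random

def hom_density(F_edges, k, W, n_samples=20000, seed=0):
    """Monte Carlo estimate of the homomorphism density t(F, W).

    F_edges: edge list of the pattern graph F on vertices 0..k-1.
    W: symmetric graphon, a function [0,1]^2 -> [0,1].
    Samples k uniform points and averages the product of W over F's edges.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = [rng.random() for _ in range(k)]
        p = 1.0
        for (u, v) in F_edges:
            p *= W(x[u], x[v])
        total += p
    return total / n_samples

# Sanity check with the constant graphon W = 1/2 (an Erdős–Rényi limit):
# edge density is 1/2 and triangle density is (1/2)^3 = 1/8.
W_half = lambda x, y: 0.5
edge = hom_density([(0, 1)], 2, W_half)
tri = hom_density([(0, 1), (1, 2), (2, 0)], 3, W_half)
```

For a constant graphon every sample contributes the same product, so the estimator is exact here; for general W it converges at the usual Monte Carlo rate.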
