Long-range interactions are essential for the correct description of complex systems in many scientific fields. The price to pay for including them in the calculations, however, is a dramatic increase in the overall computational cost. Recently, deep graph networks have been employed as efficient, data-driven models for predicting properties of complex systems represented as graphs. These models rely on a message passing strategy that should, in principle, capture long-range information without explicitly modeling the corresponding interactions. In practice, most deep graph networks cannot effectively model long-range dependencies due to the intrinsic limitations of (synchronous) message passing, namely oversmoothing, oversquashing, and underreaching. This work proposes a general framework that learns to mitigate these limitations: within a variational inference framework, we endow message passing architectures with the ability to adapt their depth and filter messages along the way. With theoretical and empirical arguments, we show that this strategy better captures long-range interactions, achieving competitive performance against the state of the art on five node- and graph-level prediction datasets.
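To make the two mechanisms in the abstract concrete, here is a minimal, hypothetical sketch of message passing with soft message filtering and adaptive depth. This is an illustration under assumed design choices (a sigmoid gate per message and an ACT-style cumulative halting score), not the paper's actual architecture; all function names and parameters (`gate_w`, `halt_w`, `threshold`) are invented for the example.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def message_passing_step(h, edges, gate_w):
    """One synchronous message passing round with soft message filtering.

    Each message from u to v is scaled by a scalar gate in [0, 1];
    filtering uninformative messages is one way to reduce oversquashing.
    `gate_w` stands in for learned gate parameters (fixed here for illustration).
    """
    new_h = h.copy()
    for u, v in edges:
        gate = sigmoid(h[u] @ gate_w)      # per-message filter in [0, 1]
        new_h[v] = new_h[v] + gate * h[u]  # aggregate the filtered message
    return new_h


def adaptive_message_passing(h, edges, gate_w, halt_w,
                             max_layers=8, threshold=0.9):
    """Run message passing until a cumulative halting score passes a threshold.

    This mimics adaptive depth: the number of rounds (and hence the receptive
    field) is chosen at inference time rather than fixed a priori, which helps
    against underreaching on tasks that need long-range information.
    """
    halt, depth = 0.0, 0
    for _ in range(max_layers):
        h = message_passing_step(h, edges, gate_w)
        depth += 1
        halt += float(sigmoid(h.mean(axis=0) @ halt_w))  # halting score
        if halt >= threshold:
            break
    return h, depth
```

Usage: on a small directed path graph, the function returns updated node features and the effective depth it chose, which is bounded by `max_layers`.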
@article{errica2025_2312.16560,
  title   = {Adaptive Message Passing: A General Framework to Mitigate Oversmoothing, Oversquashing, and Underreaching},
  author  = {Federico Errica and Henrik Christiansen and Viktor Zaverkin and Takashi Maruyama and Mathias Niepert and Francesco Alesiani},
  journal = {arXiv preprint arXiv:2312.16560},
  year    = {2025}
}