Exploring Graph-Transformer Out-of-Distribution Generalization Abilities

Main text: 11 pages, 7 figures, 4 tables; bibliography: 4 pages; appendix: 3 pages
Abstract
Deep learning on graphs has shown remarkable success across numerous applications, including social networks, biophysics, traffic networks, and recommendation systems. Despite these successes, current methods frequently rely on the assumption that training and testing data share the same distribution, a condition rarely met in real-world scenarios. While graph-transformer (GT) backbones have recently outperformed traditional message-passing neural networks (MPNNs) on multiple in-distribution (ID) benchmarks, their effectiveness under distribution shifts remains largely unexplored.