GrokFormer: Graph Fourier Kolmogorov-Arnold Transformers

Abstract

Graph Transformers (GTs) have demonstrated remarkable performance in graph representation learning, surpassing popular graph neural networks (GNNs). However, self-attention, the core module of GTs, preserves only low-frequency signals in graph features, which makes it ineffective at capturing other important signals such as high-frequency ones. Some recent GT models help alleviate this issue, but their flexibility and expressiveness remain limited because the filters they learn are fixed to a predefined graph spectrum or order. To tackle this challenge, we propose the Graph Fourier Kolmogorov-Arnold Transformer (GrokFormer), a novel GT model that learns highly expressive spectral filters with an adaptive graph spectrum and order through Fourier series modeling over learnable activation functions. We demonstrate theoretically and empirically that the proposed GrokFormer filter offers better expressiveness than other spectral methods. Comprehensive experiments on 10 real-world node classification datasets across various domains, scales, and graph properties, as well as 5 graph classification datasets, show that GrokFormer outperforms state-of-the-art GTs and GNNs. Our code is available at this https URL
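
To make the core idea concrete, below is a minimal PyTorch sketch of what a learnable Fourier-series spectral filter of this kind might look like. The class name FourierSpectralFilter, the number of harmonics, and all hyperparameters are illustrative assumptions for exposition, not the authors' implementation; it covers only the Fourier-series filtering step, not the full attention or Kolmogorov-Arnold machinery.

import torch
import torch.nn as nn

class FourierSpectralFilter(nn.Module):
    """Illustrative sketch (not the authors' code) of a spectral filter
    whose frequency response h(lam) is a truncated Fourier series with
    learnable coefficients:
        h(lam) = a0 + sum_k [ a_k cos(k*pi*lam/2) + b_k sin(k*pi*lam/2) ],
    where lam are eigenvalues of the normalized graph Laplacian, in [0, 2].
    """
    def __init__(self, num_harmonics: int = 4):
        super().__init__()
        self.a0 = nn.Parameter(torch.zeros(1))
        self.a = nn.Parameter(torch.zeros(num_harmonics))  # cosine coefficients
        self.b = nn.Parameter(torch.zeros(num_harmonics))  # sine coefficients
        self.register_buffer("k", torch.arange(1, num_harmonics + 1).float())

    def response(self, lam: torch.Tensor) -> torch.Tensor:
        # lam: (N,) eigenvalues; rescale [0, 2] to [0, pi] for the series basis.
        theta = lam.unsqueeze(-1) * (torch.pi / 2.0) * self.k       # (N, K)
        return self.a0 + (torch.cos(theta) * self.a
                          + torch.sin(theta) * self.b).sum(-1)      # (N,)

    def forward(self, eigvals: torch.Tensor, eigvecs: torch.Tensor,
                x: torch.Tensor) -> torch.Tensor:
        # Filter node features x (N, d) in the graph Fourier domain:
        #   X' = U diag(h(lam)) U^T X
        h = self.response(eigvals)
        return eigvecs @ (h.unsqueeze(-1) * (eigvecs.T @ x))

# Illustrative usage, assuming `laplacian` is the (N, N) normalized Laplacian
# and `features` is an (N, d) node-feature matrix:
#   evals, evecs = torch.linalg.eigh(laplacian)
#   out = FourierSpectralFilter(num_harmonics=4)(evals, evecs, features)

Because both the cosine and sine coefficients are learned, the response h(lam) is not constrained to a fixed polynomial order or a predefined band, which is the flexibility the abstract attributes to the GrokFormer filter.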

@article{ai2025_2411.17296,
  title={GrokFormer: Graph Fourier Kolmogorov-Arnold Transformers},
  author={Guoguo Ai and Guansong Pang and Hezhe Qiao and Yuan Gao and Hui Yan},
  journal={arXiv preprint arXiv:2411.17296},
  year={2025}
}