Implicit graph neural networks (IGNNs), which exhibit strong expressive power with a single layer, have recently demonstrated remarkable performance in capturing long-range dependencies (LRD) in underlying graphs while effectively mitigating the over-smoothing problem. However, IGNNs rely on computationally expensive fixed-point iterations, which lead to significant speed and scalability limitations and hinder their application to large-scale graphs. To achieve fast fixed-point solving for IGNNs, we propose a novel graph neural solver, IGNN-Solver, which leverages a generalized Anderson Acceleration method parameterized by a tiny GNN and learns iterative updates as a graph-dependent temporal process. To improve effectiveness on large-scale graph tasks, we further integrate sparsification and storage-compression methods, specifically tailored for the IGNN-Solver, into its design. Extensive experiments demonstrate that the IGNN-Solver significantly accelerates inference on both small- and large-scale tasks, achieving a 1.5× to 8× speedup without sacrificing accuracy. This advantage becomes more pronounced as the graph scale grows, facilitating large-scale deployment in real-world applications. The code to reproduce our results is available at this https URL.
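For intuition only, the sketch below illustrates the kind of fixed-point problem an IGNN layer poses and how classical Anderson Acceleration solves it; IGNN-Solver's contribution (not shown here) is to replace the hand-derived mixing weights with ones predicted by a tiny GNN and to add sparsification and storage compression. All names, shapes, and constants (`ignn_map`, `anderson`, the toy graph) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ignn_map(Z, W, A_hat, B):
    """One application of an IGNN-style fixed-point map f(Z) = phi(W Z A_hat + B).
    phi is taken to be ReLU; W is assumed norm-constrained so the map is contractive
    and the fixed point is well defined (as IGNNs require)."""
    return np.maximum(W @ Z @ A_hat + B, 0.0)

def anderson(f, z0, m=5, lam=1e-4, max_iter=50, tol=1e-4, beta=1.0):
    """Classical Anderson Acceleration for z = f(z), operating on flattened iterates.
    The mixing weights alpha come from a small regularized least-squares problem;
    a learned solver would instead predict alpha (and the damping) per graph."""
    shape, d = z0.shape, z0.size
    fv = lambda v: f(v.reshape(shape)).reshape(-1)      # work on flat vectors
    X, F = np.zeros((m, d)), np.zeros((m, d))           # past iterates and their images
    X[0] = z0.reshape(-1); F[0] = fv(X[0])
    X[1] = F[0];           F[1] = fv(F[0])
    H = np.zeros((m + 1, m + 1)); H[0, 1:] = H[1:, 0] = 1.0
    y = np.zeros(m + 1); y[0] = 1.0                     # enforces sum(alpha) = 1
    for k in range(2, max_iter):
        n = min(k, m)
        G = F[:n] - X[:n]                               # residuals of stored iterates
        H[1:n+1, 1:n+1] = G @ G.T + lam * np.eye(n)     # regularized Gram matrix
        alpha = np.linalg.solve(H[:n+1, :n+1], y[:n+1])[1:n+1]
        z = beta * (alpha @ F[:n]) + (1 - beta) * (alpha @ X[:n])
        X[k % m], F[k % m] = z, fv(z)
        if np.linalg.norm(F[k % m] - z) / (1e-8 + np.linalg.norm(F[k % m])) < tol:
            break
    return F[k % m].reshape(shape)

# Toy usage: 6-node graph, 4-dimensional node states.
rng = np.random.default_rng(0)
A = (rng.random((6, 6)) < 0.4).astype(float); A = np.maximum(A, A.T)
A_hat = A / np.maximum(A.sum(1), 1.0)                   # simple row normalization
W = 0.3 * rng.standard_normal((4, 4)) / 4               # small norm -> contraction
B = rng.standard_normal((4, 6))                         # injected input features
Z_star = anderson(lambda Z: ignn_map(Z, W, A_hat, B), np.zeros((4, 6)))
```

The usual solver spends most of its time in repeated applications of `f` and in choosing good mixing weights; making that choice a cheap, graph-conditioned prediction is what allows the learned solver to converge in far fewer iterations.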
@article{lin2025_2410.08524,
  title   = {IGNN-Solver: A Graph Neural Solver for Implicit Graph Neural Networks},
  author  = {Junchao Lin and Zenan Ling and Zhanbo Feng and Jingwen Xu and Minxuan Liao and Feng Zhou and Tianqi Hou and Zhenyu Liao and Robert C. Qiu},
  journal = {arXiv preprint arXiv:2410.08524},
  year    = {2025}
}