IGNN-Solver: A Graph Neural Solver for Implicit Graph Neural Networks

11 October 2024
Junchao Lin
Zenan Ling
Zhanbo Feng
Jingwen Xu
Minxuan Liao
Feng Zhou
Tianqi Hou
Zhenyu Liao
Robert C. Qiu
    GNN
    AI4CE
Abstract

Implicit graph neural networks (IGNNs), which exhibit strong expressive power with a single layer, have recently demonstrated remarkable performance in capturing long-range dependencies (LRD) in underlying graphs while effectively mitigating the over-smoothing problem. However, IGNNs rely on computationally expensive fixed-point iterations, which lead to significant speed and scalability limitations, hindering their application to large-scale graphs. To achieve fast fixed-point solving for IGNNs, we propose a novel graph neural solver, IGNN-Solver, which leverages the generalized Anderson Acceleration method, parameterized by a tiny GNN, and learns iterative updates as a graph-dependent temporal process. To improve effectiveness on large-scale graph tasks, we further integrate sparsification and storage compression methods, specifically tailored for the IGNN-Solver, into its design. Extensive experiments demonstrate that the IGNN-Solver significantly accelerates inference on both small- and large-scale tasks, achieving a 1.5× to 8× speedup without sacrificing accuracy. This advantage becomes more pronounced as the graph scale grows, facilitating its large-scale deployment in real-world applications. The code to reproduce our results is available at this https URL.
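For context on the baseline the abstract refers to: classical Anderson acceleration speeds up a fixed-point iteration x = g(x) by mixing the last few iterates with least-squares weights on their residuals. The sketch below is a minimal, generic version of that classical method (function names and the regularization parameter are our own); the paper's IGNN-Solver instead *learns* the mixing step with a tiny GNN, which is not reproduced here.

```python
import numpy as np

def anderson_fixed_point(g, x0, m=5, max_iter=50, tol=1e-8, lam=1e-10):
    """Plain Anderson acceleration for solving x = g(x).

    A generic sketch of the classical method; `m` is the history
    window and `lam` a small regularizer for the weight solve.
    """
    x = np.asarray(x0, dtype=float).copy()
    X, F = [], []  # histories of g-evaluations and residuals
    for k in range(max_iter):
        gx = g(x)
        f = gx - x  # fixed-point residual
        if np.linalg.norm(f) < tol:
            return x, k
        X.append(gx)
        F.append(f)
        if len(X) > m:  # keep only the last m entries
            X.pop(0)
            F.pop(0)
        n = len(X)
        # Mixing weights alpha (summing to 1) that minimize the norm of
        # the combined residual, via a small regularized normal system.
        Fm = np.stack(F, axis=1)          # residual matrix, shape (d, n)
        G = Fm.T @ Fm + lam * np.eye(n)   # regularized Gram matrix
        alpha = np.linalg.solve(G, np.ones(n))
        alpha /= alpha.sum()
        x = np.stack(X, axis=1) @ alpha   # mixed next iterate
    return x, max_iter
```

On a simple contractive map such as elementwise cosine, this converges to the fixed point in far fewer iterations than plain Picard iteration; the paper's contribution is replacing the hand-tuned weight solve above with a learned, graph-dependent update.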

View on arXiv
@article{lin2025_2410.08524,
  title={IGNN-Solver: A Graph Neural Solver for Implicit Graph Neural Networks},
  author={Junchao Lin and Zenan Ling and Zhanbo Feng and Jingwen Xu and Minxuan Liao and Feng Zhou and Tianqi Hou and Zhenyu Liao and Robert C. Qiu},
  journal={arXiv preprint arXiv:2410.08524},
  year={2025}
}