ResearchTrend.AI
LLM4GNAS: A Large Language Model Based Toolkit for Graph Neural Architecture Search

12 February 2025
Yang Gao
Hong Yang
Yizhi Chen
Junxian Wu
Peng Zhang
Haishuai Wang
Abstract

Graph Neural Architecture Search (GNAS) facilitates the automatic design of Graph Neural Networks (GNNs) tailored to specific downstream graph learning tasks. However, existing GNAS approaches often require manual adaptation to new graph search spaces, necessitating substantial code optimization and domain-specific knowledge. To address this challenge, we present LLM4GNAS, a toolkit for GNAS that leverages the generative capabilities of Large Language Models (LLMs). LLM4GNAS includes a library of LLM-based graph neural architecture search algorithms, enabling GNAS methods to adapt to new search spaces simply by modifying LLM prompts. This approach reduces the need for manual intervention in algorithm adaptation and code modification. The LLM4GNAS toolkit is extensible and robust, incorporating LLM-enhanced graph feature engineering, LLM-enhanced graph neural architecture search, and LLM-enhanced hyperparameter optimization. Experimental results indicate that LLM4GNAS outperforms existing GNAS methods on tasks involving both homogeneous and heterogeneous graphs.
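The abstract describes a loop in which the search space is expressed as an LLM prompt, the LLM proposes candidate architectures, and evaluation feedback refines subsequent prompts. The sketch below is a hypothetical illustration of that idea, not the LLM4GNAS implementation: `query_llm` is a random-sampling stand-in for a real LLM call, `evaluate` is a mock scorer in place of actual GNN training, and the operator names (`GCNConv`, `GATConv`, `SAGEConv`) are example layer types.

```python
import random

# Hypothetical search space, rendered as text so the LLM can read it.
SEARCH_SPACE = {
    "conv": ["GCNConv", "GATConv", "SAGEConv"],
    "aggregation": ["mean", "max", "sum"],
    "layers": [2, 3, 4],
}


def build_prompt(search_space, history):
    """Render the search space and past results as a natural-language prompt."""
    lines = ["Pick one option per field to maximise validation accuracy."]
    for field, options in search_space.items():
        lines.append(f"{field}: {options}")
    for arch, score in history:
        lines.append(f"Tried {arch} -> accuracy {score:.3f}")
    return "\n".join(lines)


def query_llm(prompt, search_space):
    """Stand-in for an LLM API call: here it just samples a random candidate."""
    return {field: random.choice(opts) for field, opts in search_space.items()}


def evaluate(arch):
    """Stand-in for training the proposed GNN; returns a mock validation score."""
    score = 0.5 + 0.1 * arch["layers"]
    if arch["conv"] == "GATConv":
        score += 0.05
    return min(score, 1.0)


def search(rounds=5, seed=0):
    """Prompt -> propose -> evaluate -> append feedback, keeping the best."""
    random.seed(seed)
    history, best = [], (None, -1.0)
    for _ in range(rounds):
        prompt = build_prompt(SEARCH_SPACE, history)
        arch = query_llm(prompt, SEARCH_SPACE)
        score = evaluate(arch)
        history.append((arch, score))
        if score > best[1]:
            best = (arch, score)
    return best
```

Adapting the search to a new space then amounts to editing the `SEARCH_SPACE` dictionary (and hence the prompt text) rather than rewriting search code, which is the adaptation benefit the abstract claims.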

View on arXiv
@article{gao2025_2502.10459,
  title={LLM4GNAS: A Large Language Model Based Toolkit for Graph Neural Architecture Search},
  author={Yang Gao and Hong Yang and Yizhi Chen and Junxian Wu and Peng Zhang and Haishuai Wang},
  journal={arXiv preprint arXiv:2502.10459},
  year={2025}
}