Hypergraph Foundation Model

3 March 2025
Yifan Feng, Shiquan Liu, Xiangmin Han, Shaoyi Du, Zongze Wu, Han Hu, Yue Gao
Abstract

Hypergraph neural networks (HGNNs) effectively model complex high-order relationships in domains like protein interactions and social networks by connecting multiple vertices through hyperedges, enhancing modeling capability and reducing information loss. Developing foundation models for hypergraphs is challenging due to the distinct nature of hypergraph data, which couples vertex features with intricate structural information. We present Hyper-FM, a Hypergraph Foundation Model for multi-domain knowledge extraction, featuring Hierarchical High-Order Neighbor Guided Vertex Knowledge Embedding for vertex feature representation and Hierarchical Multi-Hypergraph Guided Structural Knowledge Extraction for structural information. Additionally, we curate 10 text-attributed hypergraph datasets to advance research at the intersection of HGNNs and LLMs. Experiments on these datasets show that Hyper-FM outperforms baseline methods by approximately 13.3%, validating our approach. Furthermore, we propose the first scaling law for hypergraph foundation models, demonstrating that increasing domain diversity significantly enhances performance, whereas merely increasing vertex and hyperedge counts does not. This underscores the critical role of domain diversity in scaling hypergraph models.
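To make the hyperedge idea concrete, below is a minimal sketch of incidence-matrix message passing in the style of a generic hypergraph convolution: vertex features are averaged into each hyperedge, then scattered back to member vertices. This illustrates only the general HGNN mechanism the abstract refers to, not the Hyper-FM architecture; the function name, shapes, and mean normalization are illustrative assumptions.

import numpy as np

# Minimal sketch of hypergraph message passing (generic HGNN-style layer,
# NOT the paper's Hyper-FM). A hyperedge joins any number of vertices, so
# features flow vertex -> hyperedge -> vertex through the incidence matrix.

def hypergraph_conv(X, H, W):
    """One illustrative convolution step.
    X: (n_vertices, d_in) vertex features
    H: (n_vertices, n_hyperedges) incidence matrix, H[v, e] = 1 iff
       vertex v belongs to hyperedge e
    W: (d_in, d_out) weight matrix (random here; learned in practice)
    """
    De = H.sum(axis=0)                      # hyperedge degrees
    Dv = H.sum(axis=1)                      # vertex degrees
    edge_feat = (H.T @ X) / De[:, None]     # gather: mean over member vertices
    X_new = (H @ edge_feat) / Dv[:, None]   # scatter: mean over incident hyperedges
    return X_new @ W

# Toy example: 4 vertices, hyperedges {0, 1, 2} and {2, 3}.
H = np.array([[1, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
X = np.random.randn(4, 8)
W = np.random.randn(8, 4)
print(hypergraph_conv(X, H, W).shape)       # (4, 4)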

View on arXiv: https://arxiv.org/abs/2503.01203
@article{feng2025_2503.01203,
  title={Hypergraph Foundation Model},
  author={Yifan Feng and Shiquan Liu and Xiangmin Han and Shaoyi Du and Zongze Wu and Han Hu and Yue Gao},
  journal={arXiv preprint arXiv:2503.01203},
  year={2025}
}