
Recent Advances in Hypergraph Neural Networks

Abstract

The growing interest in hypergraph neural networks (HGNNs) is driven by their capacity to capture the complex relationships and patterns within hypergraph-structured data across various domains, including computer vision, complex networks, and natural language processing. This paper comprehensively reviews recent advances in HGNNs and presents a taxonomy of mainstream models based on their architectures: hypergraph convolutional networks (HGCNs), hypergraph attention networks (HGATs), hypergraph autoencoders (HGAEs), hypergraph recurrent networks (HGRNs), and deep hypergraph generative models (DHGGMs). For each category, we delve into its practical applications, mathematical mechanisms, literature contributions, and open problems. Finally, we discuss some common challenges and promising research directions. This paper aspires to be a helpful resource that provides guidance for future research and applications of HGNNs.
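
The HGCN category in the taxonomy above typically centers on the spectral hypergraph convolution popularized by Feng et al.'s HGNN, X' = sigma(Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta), where H is the vertex-hyperedge incidence matrix, W the diagonal hyperedge-weight matrix, and Dv, De the vertex and hyperedge degree matrices. The following is a minimal NumPy sketch of that operator, assuming this standard formulation; the function name and toy data are illustrative, not drawn from the paper.

import numpy as np

def hypergraph_conv(X, H, w, Theta):
    # One spectral hypergraph convolution layer (sketch of the standard
    # HGNN formulation, not the paper's own code):
    #   X' = relu(Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta)
    # X: (n_vertices, d_in) node features
    # H: (n_vertices, n_edges) incidence matrix, H[v, e] = 1 if vertex v is in hyperedge e
    # w: (n_edges,) hyperedge weights
    # Theta: (d_in, d_out) learnable projection
    dv = H @ w                # vertex degrees: weighted count of incident hyperedges
    de = H.sum(axis=0)        # hyperedge degrees: number of vertices per hyperedge
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(dv, 1e-12)))
    De_inv = np.diag(1.0 / np.maximum(de, 1e-12))
    W = np.diag(w)
    A = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt   # normalized propagation matrix
    return np.maximum(A @ X @ Theta, 0.0)                   # ReLU nonlinearity

# Toy example: 4 vertices, 2 hyperedges (purely illustrative).
rng = np.random.default_rng(0)
H = np.array([[1, 0],
              [1, 1],
              [0, 1],
              [1, 1]], dtype=float)
X = rng.normal(size=(4, 3))
w = np.ones(2)
Theta = rng.normal(size=(3, 2))
print(hypergraph_conv(X, H, w, Theta).shape)  # (4, 2)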

@article{yang2025_2503.07959,
  title={Recent Advances in Hypergraph Neural Networks},
  author={Murong Yang and Xin-Jian Xu},
  journal={arXiv preprint arXiv:2503.07959},
  year={2025}
}