Mixture of Experts for Node Classification

30 November 2024
Yu Shi
Yiqi Wang
WeiXuan Lang
Jiaxin Zhang
Pan Dong
Aiping Li
Abstract

Nodes in real-world graphs exhibit diverse patterns in numerous aspects, such as degree and homophily. However, most existing node predictors fail to capture a wide range of node patterns or to make predictions based on distinct node patterns, resulting in unsatisfactory classification performance. In this paper, we show that different node predictors excel at handling nodes with specific patterns, and that applying a single node predictor uniformly can lead to suboptimal results. To bridge this gap, we propose a mixture-of-experts framework, MoE-NP, for node classification. Specifically, MoE-NP combines a mixture of node predictors and strategically selects models based on node patterns. Experimental results on a range of real-world datasets demonstrate significant performance improvements from MoE-NP.
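The abstract does not specify MoE-NP's architecture, so the following is only a generic illustrative sketch of the core idea: a gating function reads per-node pattern features (e.g. degree, homophily) and produces per-node weights over several pre-trained node predictors, whose class logits are then mixed. All names, shapes, and the linear gating form are assumptions, not the authors' method.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def moe_node_classify(node_feats, expert_logits, gate_W):
    """Mix per-expert class logits with pattern-based gating weights.

    node_feats:    (n, d)    per-node pattern features (hypothetical: degree,
                             local homophily, etc.)
    expert_logits: (k, n, c) class logits from k distinct node predictors
    gate_W:        (d, k)    gating parameters (assumed linear gate; in
                             practice these would be learned jointly)
    """
    gates = softmax(node_feats @ gate_W, axis=-1)          # (n, k) weights
    mixed = np.einsum("nk,knc->nc", gates, expert_logits)  # weighted logits
    return mixed.argmax(axis=-1), gates
```

Each node thus receives its own convex combination of experts, so nodes with different patterns can lean on different predictors rather than a single model applied uniformly.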

View on arXiv
@article{shi2025_2412.00418,
  title={Mixture of Experts for Node Classification},
  author={Yu Shi and Yiqi Wang and WeiXuan Lang and Jiaxin Zhang and Pan Dong and Aiping Li},
  journal={arXiv preprint arXiv:2412.00418},
  year={2025}
}