ResearchTrend.AI


Training Robust Graph Neural Networks by Modeling Noise Dependencies

27 February 2025
Yeonjun In
Kanghoon Yoon
Sukwon Yun
Kibum Kim
Sungchul Kim
Chanyoung Park
Abstract

In real-world applications, node features in graphs often contain noise from various sources, leading to significant performance degradation in GNNs. Although several methods have been developed to enhance robustness, they rely on the unrealistic assumption that noise in node features is independent of the graph structure and node labels, which limits their applicability. To address this, we introduce a more realistic noise scenario, dependency-aware noise on graphs (DANG), in which noise in node features creates a chain of noise dependencies that propagates to the graph structure and node labels. We propose a novel robust GNN, DA-GNN, which captures the causal relationships among variables in the data generating process (DGP) of DANG using variational inference. In addition, we present new benchmark datasets that simulate DANG in real-world applications, enabling more practical research on robust GNNs. Extensive experiments demonstrate that DA-GNN consistently outperforms existing baselines across various noise scenarios, including both DANG and the conventional noise models commonly considered in this field.
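The chain of dependencies the abstract describes (feature noise propagating into structure and label noise) can be sketched on a toy graph. This is a minimal illustration, not the paper's actual DGP: the homophilous edge rule, the Gaussian noise model, and the similarity threshold are all assumptions made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 6 nodes with 2-d features; clean labels and a
# homophilous clean structure both derive from the features.
X = rng.normal(size=(6, 2))              # clean node features
y = (X[:, 0] > 0).astype(int)            # clean labels
edges = {(i, j) for i in range(6) for j in range(i + 1, 6)
         if y[i] == y[j]}                # clean, label-aligned edges

# Step 1: feature noise hits a random subset of nodes.
noisy = rng.random(6) < 0.5
X_noisy = X.copy()
X_noisy[noisy] += rng.normal(scale=3.0, size=(int(noisy.sum()), 2))

# Step 2: structure noise DEPENDS on the corrupted features --
# edges are rewired by similarity of the noisy features, so
# feature noise propagates into the graph structure.
def sim(i, j):
    return -np.linalg.norm(X_noisy[i] - X_noisy[j])

edges_noisy = {(i, j) for i in range(6) for j in range(i + 1, 6)
               if sim(i, j) > -2.0}

# Step 3: label noise also depends on the corrupted features,
# completing the DANG-style dependency chain.
y_noisy = (X_noisy[:, 0] > 0).astype(int)

print("labels flipped by the noise chain:", int((y != y_noisy).sum()))
```

Under the independent-noise assumption the paper criticizes, steps 2 and 3 would instead draw their corruptions independently of `X_noisy`; modeling the dependency chain is what distinguishes the DANG setting.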

@article{in2025_2502.19670,
  title={Training Robust Graph Neural Networks by Modeling Noise Dependencies},
  author={Yeonjun In and Kanghoon Yoon and Sukwon Yun and Kibum Kim and Sungchul Kim and Chanyoung Park},
  journal={arXiv preprint arXiv:2502.19670},
  year={2025}
}