Rethinking Federated Graph Learning: A Data Condensation Perspective

5 May 2025
Hao Zhang
Xunkai Li
Yinlin Zhu
Lianglin Hu
Communities: FedML, DD, AI4CE
Abstract

Federated graph learning (FGL) is a widely recognized technique that promotes collaborative training of graph neural networks (GNNs) across multiple clients. However, existing approaches rely heavily on communicating model parameters or gradients for federated optimization and fail to adequately address the data heterogeneity introduced by intricate and diverse graph distributions. Although some methods attempt to share additional messages between the server and clients to improve federated convergence during communication, they introduce significant privacy risks and increase communication overhead. To address these issues, we introduce the concept of a condensed graph as a novel optimization carrier for FGL data heterogeneity and propose a new FGL paradigm called FedGM. Specifically, we utilize a generalized condensation graph consensus to aggregate comprehensive knowledge from distributed graphs, while minimizing communication costs and privacy risks through a single transmission of the condensed data. Extensive experiments on six public datasets consistently demonstrate the superiority of FedGM over state-of-the-art baselines, highlighting its potential as a novel FGL paradigm.
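
To make the paradigm concrete, below is a minimal, self-contained sketch of a condensation-based federated flow. It assumes a gradient-matching condensation objective (a standard dataset-distillation technique; the abstract does not specify FedGM's exact condensation loss or its consensus mechanism), and every name in it (TinyGCN, condense_graph, server_train) is a hypothetical illustration, not the authors' code.

import torch
import torch.nn.functional as F

class TinyGCN(torch.nn.Module):
    # One-layer GCN operating on a dense adjacency matrix.
    def __init__(self, in_dim, num_classes):
        super().__init__()
        self.lin = torch.nn.Linear(in_dim, num_classes)

    def forward(self, adj, x):
        # Propagate features over the graph, then classify each node.
        return self.lin(adj @ x)

def condense_graph(adj, x, y, n_syn=8, steps=200, lr=0.01):
    # Client side: learn a small synthetic graph whose training
    # gradients match those of the real local graph (gradient
    # matching; a hypothetical stand-in for FedGM's condensation).
    in_dim, n_cls = x.size(1), int(y.max()) + 1
    x_syn = torch.randn(n_syn, in_dim, requires_grad=True)
    y_syn = torch.randint(0, n_cls, (n_syn,))
    adj_syn = torch.eye(n_syn)          # fixed adjacency for simplicity
    opt = torch.optim.Adam([x_syn], lr=lr)
    for _ in range(steps):
        model = TinyGCN(in_dim, n_cls)  # fresh random weights each step
        g_real = torch.autograd.grad(
            F.cross_entropy(model(adj, x), y), tuple(model.parameters()))
        g_syn = torch.autograd.grad(
            F.cross_entropy(model(adj_syn, x_syn), y_syn),
            tuple(model.parameters()), create_graph=True)
        loss = sum(((a - b) ** 2).sum() for a, b in zip(g_real, g_syn))
        opt.zero_grad()
        loss.backward()
        opt.step()
    return adj_syn, x_syn.detach(), y_syn

def server_train(condensed, in_dim, n_cls, epochs=200):
    # Server side: each client ships its condensed graph ONCE; the
    # server forms their disjoint union and trains one global GNN.
    adjs, xs, ys = zip(*condensed)
    adj = torch.block_diag(*adjs)       # block-diagonal = disjoint union
    x, y = torch.cat(xs), torch.cat(ys)
    model = TinyGCN(in_dim, n_cls)
    opt = torch.optim.Adam(model.parameters(), lr=0.01)
    for _ in range(epochs):
        loss = F.cross_entropy(model(adj, x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model

In this sketch the only client-to-server traffic is the one-shot transfer of each (adj_syn, x_syn, y_syn) triple, which is the communication- and privacy-saving argument the abstract makes against iterative parameter or gradient exchange.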

View on arXiv
@article{zhang2025_2505.02573,
  title={Rethinking Federated Graph Learning: A Data Condensation Perspective},
  author={Hao Zhang and Xunkai Li and Yinlin Zhu and Lianglin Hu},
  journal={arXiv preprint arXiv:2505.02573},
  year={2025}
}