Federated Variational Inference for Bayesian Mixture Models

18 February 2025
Jackie Rao
Francesca L. Crowe
Tom Marshall
Sylvia Richardson
Paul D. W. Kirk
    FedML
Abstract

We present a federated learning approach for Bayesian model-based clustering of large-scale binary and categorical datasets. We introduce a principled 'divide and conquer' inference procedure using variational inference with local merge and delete moves within batches of the data in parallel, followed by 'global' merge moves across batches to find global clustering structures. We show that these merge moves require only summaries of the data in each batch, enabling federated learning across local nodes without requiring the full dataset to be shared. Empirical results on simulated and benchmark datasets demonstrate that our method performs well in comparison to existing clustering algorithms. We validate the practical utility of the method by applying it to large-scale electronic health record (EHR) data.
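To make the divide-and-conquer structure concrete, below is a minimal illustrative sketch in Python/NumPy of the kind of batch-level summaries and a greedy global merge step the abstract describes, for binary data modelled with Bernoulli cluster profiles. The function names (local_summaries, global_merge) and the threshold-based similarity rule are assumptions made for illustration; the paper's actual merge moves are carried out within its variational inference procedure rather than by this simplified criterion.

import numpy as np

def local_summaries(X, resp):
    # Per-batch summaries computed at one node.
    # X    : (n, d) binary data held locally
    # resp : (n, K) variational responsibilities from local inference
    # Only these summaries leave the node, never the raw data.
    counts = resp.sum(axis=0)   # (K,) expected number of points per cluster
    sums = resp.T @ X           # (K, d) expected feature counts per cluster
    return counts, sums

def global_merge(summaries, threshold=0.1):
    # Greedy global merge of clusters pooled across batches.
    # summaries : list of (counts, sums) pairs, one per batch
    # Clusters whose mean Bernoulli profiles differ by less than
    # `threshold` (mean absolute difference) are merged by adding
    # their sufficient statistics. This rule is a stand-in for the
    # paper's variational merge criterion.
    counts = np.concatenate([c for c, _ in summaries])
    sums = np.vstack([s for _, s in summaries])

    merged_counts, merged_sums = [], []
    for k in range(len(counts)):
        profile = sums[k] / max(counts[k], 1e-12)
        placed = False
        for j in range(len(merged_counts)):
            existing = merged_sums[j] / max(merged_counts[j], 1e-12)
            if np.mean(np.abs(existing - profile)) < threshold:
                merged_counts[j] += counts[k]
                merged_sums[j] += sums[k]
                placed = True
                break
        if not placed:
            merged_counts.append(counts[k])
            merged_sums.append(sums[k].copy())

    return np.array(merged_counts), np.vstack(merged_sums)

In this sketch, each node runs its own local inference, calls local_summaries on its batch, and transmits only the (counts, sums) pair; a coordinator then combines the pooled clusters with global_merge. This mirrors the abstract's claim that the global merge moves require only per-batch summaries, so the full dataset never needs to be shared.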

View on arXiv
@article{rao2025_2502.12684,
  title={Federated Variational Inference for Bayesian Mixture Models},
  author={Jackie Rao and Francesca L. Crowe and Tom Marshall and Sylvia Richardson and Paul D. W. Kirk},
  journal={arXiv preprint arXiv:2502.12684},
  year={2025}
}