ResearchTrend.AI

arXiv:2505.22108 · v3 (latest)

Inclusive, Differentially Private Federated Learning for Clinical Data

28 May 2025
Santhosh Parampottupadam
Melih Coşğun
Sarthak Pati
M. Zenk
Saikat Roy
Dimitrios Bounias
Benjamin Hamm
Sinem Sav
R. Floca
Klaus H. Maier-Hein
Community: FedML
Links: arXiv (abs) · PDF · HTML · GitHub (30,251★)
Main: 9 pages · 1 figure · Bibliography: 2 pages · 3 tables
Abstract

Federated Learning (FL) offers a promising approach for training clinical AI models without centralizing sensitive patient data. However, its real-world adoption is hindered by challenges related to privacy, resource constraints, and compliance. Existing Differential Privacy (DP) approaches often apply uniform noise, which disproportionately degrades model performance, even for fully compliant institutions. In this work, we propose a novel compliance-aware FL framework that enhances DP by adaptively adjusting noise based on quantifiable client compliance scores. Additionally, we introduce a compliance scoring tool based on key healthcare and security standards to promote secure, inclusive, and equitable participation across diverse clinical settings. Extensive experiments on public datasets demonstrate that integrating under-resourced, less compliant clinics with highly regulated institutions yields accuracy improvements of up to 15% over traditional FL. This work advances FL by balancing privacy, compliance, and performance, making it a viable solution for real-world clinical workflows in global healthcare.
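The core idea — adding DP noise whose magnitude depends on a client's compliance score — can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, the inverse-score scaling rule, and all parameter values here are illustrative assumptions; the paper's actual noise schedule and compliance scoring tool are described in the full text.

```python
import math
import random

def clip_and_noise(update, clip_norm, base_sigma, compliance, rng):
    """Perturb one client's model update, DP-SGD style, with
    compliance-scaled Gaussian noise (hypothetical scaling rule)."""
    # Clip the update to L2 norm `clip_norm`, as in standard DP-SGD.
    norm = math.sqrt(sum(x * x for x in update))
    scale = min(1.0, clip_norm / max(norm, 1e-12))
    clipped = [x * scale for x in update]
    # Noise std grows as the compliance score (in (0, 1]) falls:
    # less compliant clients contribute more strongly perturbed updates.
    sigma = base_sigma * clip_norm / max(compliance, 1e-3)
    return [x + rng.gauss(0.0, sigma) for x in clipped]

rng = random.Random(0)
# A highly compliant client (score 0.9) gets relatively mild noise.
noised = clip_and_noise([3.0, 3.0, 3.0, 3.0],
                        clip_norm=1.0, base_sigma=0.5,
                        compliance=0.9, rng=rng)
```

With `base_sigma = 0` the function reduces to plain norm clipping, which makes the two steps easy to check independently.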
