ResearchTrend.AI
FEDKIM: Adaptive Federated Knowledge Injection into Medical Foundation Models

17 August 2024
Xiaochen Wang
Jiaqi Wang
Houping Xiao
Jinghui Chen
Fenglong Ma
Abstract

Foundation models have demonstrated remarkable capabilities in handling diverse modalities and tasks, outperforming conventional artificial intelligence (AI) approaches that are highly task-specific and modality-reliant. In the medical domain, however, the development of comprehensive foundation models is constrained by limited access to diverse modalities and stringent privacy regulations. To address these constraints, this study introduces a novel knowledge injection approach, FedKIM, designed to scale the medical foundation model within a federated learning framework. FedKIM leverages lightweight local models to extract healthcare knowledge from private data and integrates this knowledge into a centralized foundation model using a designed adaptive Multitask Multimodal Mixture Of Experts (M3OE) module. This method not only preserves privacy but also enhances the model's ability to handle complex medical tasks involving multiple modalities. Our extensive experiments across twelve tasks in seven modalities demonstrate the effectiveness of FedKIM in various settings, highlighting its potential to scale medical foundation models without direct access to sensitive data.
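The abstract's central component is the adaptive Multitask Multimodal Mixture of Experts (M3OE) module, which routes inputs from different tasks and modalities to a weighted combination of experts. The paper's actual implementation is not reproduced on this page; the following is a minimal NumPy sketch of the general MoE-routing idea only, where the gate is conditioned on the input plus a task/modality embedding. All class names, shapes, and the gating design here are illustrative assumptions, not the authors' code.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class M3OESketch:
    """Toy mixture-of-experts router (hypothetical, for illustration).

    Each expert is a linear map; a gate produces per-example routing
    weights conditioned on the input plus a task/modality embedding.
    """
    def __init__(self, n_experts, d_in, d_out, seed=0):
        rng = np.random.default_rng(seed)
        self.experts = [rng.normal(scale=0.1, size=(d_in, d_out))
                        for _ in range(n_experts)]
        self.gate = rng.normal(scale=0.1, size=(d_in, n_experts))

    def forward(self, x, task_emb):
        # Condition the gate on a task/modality embedding (assumed design).
        h = x + task_emb                      # broadcast over the batch
        w = softmax(h @ self.gate)            # (batch, n_experts) weights
        # Run every expert, then mix their outputs by the routing weights.
        outs = np.stack([x @ E for E in self.experts], axis=1)
        return (w[:, :, None] * outs).sum(axis=1), w

# Usage: route a batch of 5 inputs through 3 experts.
moe = M3OESketch(n_experts=3, d_in=4, d_out=2)
y, weights = moe.forward(np.ones((5, 4)), np.zeros((1, 4)))
```

In a federated setting along the lines the abstract describes, the expert parameters would be informed by knowledge extracted from lightweight local models, while the gate adapts per task and modality; that transfer step is specific to the paper and is not sketched here.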

@article{wang2025_2408.10276,
  title={FEDKIM: Adaptive Federated Knowledge Injection into Medical Foundation Models},
  author={Xiaochen Wang and Jiaqi Wang and Houping Xiao and Jinghui Chen and Fenglong Ma},
  journal={arXiv preprint arXiv:2408.10276},
  year={2025}
}