Learning to Be A Doctor: Searching for Effective Medical Agent Architectures

15 April 2025
Yangyang Zhuang
Wenjia Jiang
Jiayu Zhang
Ze Yang
Joey Tianyi Zhou
Chi Zhang
Abstract

Large Language Model (LLM)-based agents have demonstrated strong capabilities across a wide range of tasks, and their application in the medical domain holds particular promise due to the demand for high generalizability and reliance on interdisciplinary knowledge. However, existing medical agent systems often rely on static, manually crafted workflows that lack the flexibility to accommodate diverse diagnostic requirements and adapt to emerging clinical scenarios. Motivated by the success of automated machine learning (AutoML), this paper introduces a novel framework for the automated design of medical agent architectures. Specifically, we define a hierarchical and expressive agent search space that enables dynamic workflow adaptation through structured modifications at the node, structural, and framework levels. Our framework conceptualizes medical agents as graph-based architectures composed of diverse, functional node types and supports iterative self-improvement guided by diagnostic feedback. Experimental results on skin disease diagnosis tasks demonstrate that the proposed method effectively evolves workflow structures and significantly enhances diagnostic accuracy over time. This work represents the first fully automated framework for medical agent architecture design and offers a scalable, adaptable foundation for deploying intelligent agents in real-world clinical environments.
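To make the abstract's key ideas concrete, the sketch below illustrates in Python what a graph-based agent workflow with typed functional nodes and an iterative, feedback-guided search over node-level, structural, and framework-level modifications could look like. This is a minimal illustrative assumption, not the paper's actual implementation: every name here (Workflow, NODE_TYPES, evaluate, search) is hypothetical, and the scoring function is a random placeholder standing in for real diagnostic accuracy.

# Hypothetical sketch of graph-based medical agent architecture search.
# All names and logic are illustrative assumptions, not the paper's code.
import random
from dataclasses import dataclass, field

# Assumed functional node types a diagnostic workflow might compose.
NODE_TYPES = ["perceive", "retrieve", "reason", "verify", "diagnose"]

@dataclass
class Workflow:
    """An agent modeled as a directed graph of functional nodes."""
    nodes: list = field(default_factory=lambda: ["perceive", "diagnose"])
    edges: set = field(default_factory=set)  # (src_index, dst_index) pairs

    def mutate(self) -> "Workflow":
        """Apply one structured modification at a random granularity."""
        child = Workflow(list(self.nodes), set(self.edges))
        level = random.choice(["node", "structural", "framework"])
        if level == "node":
            # Node level: swap the function of one existing node.
            i = random.randrange(len(child.nodes))
            child.nodes[i] = random.choice(NODE_TYPES)
        elif level == "structural":
            # Structural level: grow the graph with a new node.
            child.nodes.append(random.choice(NODE_TYPES))
        else:
            # Framework level: rewire information flow between nodes.
            a, b = random.sample(range(len(child.nodes)), 2)
            child.edges.add((a, b))
        return child

def evaluate(wf: Workflow) -> float:
    """Placeholder for diagnostic accuracy on validation cases.
    A real system would execute the workflow on patient data;
    here a random score keeps the sketch self-contained."""
    return random.random()

def search(budget: int = 50) -> Workflow:
    """Greedy iterative self-improvement guided by diagnostic feedback:
    keep a candidate architecture only if it scores better."""
    best = Workflow()
    best_score = evaluate(best)
    for _ in range(budget):
        cand = best.mutate()
        score = evaluate(cand)
        if score > best_score:
            best, best_score = cand, score
    return best

if __name__ == "__main__":
    wf = search()
    print(wf.nodes, wf.edges)

The greedy accept-if-better loop is one simple way to realize "iterative self-improvement"; the paper's framework may use a richer search strategy, but the separation of a hierarchical mutation space from a diagnostic-feedback evaluator is the idea the abstract describes.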

@article{zhuang2025_2504.11301,
  title={Learning to Be A Doctor: Searching for Effective Medical Agent Architectures},
  author={Yangyang Zhuang and Wenjia Jiang and Jiayu Zhang and Ze Yang and Joey Tianyi Zhou and Chi Zhang},
  journal={arXiv preprint arXiv:2504.11301},
  year={2025}
}