
BioMARS: A Multi-Agent Robotic System for Autonomous Biological Experiments

Yibo Qiu
Zan Huang
Zhiyu Wang
Handi Liu
Yiling Qiao
Yifeng Hu
Shuang Sun
Hangke Peng
Ronald X Xu
Mingzhai Sun
Main: 15 pages
8 figures
1 table
Bibliography: 2 pages
Appendix: 11 pages
Abstract

Large language models (LLMs) and vision-language models (VLMs) have the potential to transform biological research by enabling autonomous experimentation. Yet, their application remains constrained by rigid protocol design, limited adaptability to dynamic lab conditions, inadequate error handling, and high operational complexity. Here we introduce BioMARS (Biological Multi-Agent Robotic System), an intelligent platform that integrates LLMs, VLMs, and modular robotics to autonomously design, plan, and execute biological experiments. BioMARS uses a hierarchical architecture: the Biologist Agent synthesizes protocols via retrieval-augmented generation; the Technician Agent translates them into executable robotic pseudo-code; and the Inspector Agent ensures procedural integrity through multimodal perception and anomaly detection. The system autonomously conducts cell passaging and culture tasks, matching or exceeding manual performance in viability, consistency, and morphological integrity. It also supports context-aware optimization, outperforming conventional strategies in differentiating retinal pigment epithelial cells. A web interface enables real-time human-AI collaboration, while a modular backend allows scalable integration with laboratory hardware. These results highlight the feasibility of generalizable, AI-driven laboratory automation and the transformative role of language-based reasoning in biological research.

@article{qiu2025_2507.01485,
  title={BioMARS: A Multi-Agent Robotic System for Autonomous Biological Experiments},
  author={Yibo Qiu and Zan Huang and Zhiyu Wang and Handi Liu and Yiling Qiao and Yifeng Hu and Shuang Sun and Hangke Peng and Ronald X Xu and Mingzhai Sun},
  journal={arXiv preprint arXiv:2507.01485},
  year={2025}
}