Transformer representation learning is necessary for dynamic multi-modal physiological data on small-cohort patients

5 April 2025
Bingxu Wang
Min Ge
Kunzhi Cai
Yuqi Zhang
Zeyi Zhou
Wenjiao Li
Yachong Guo
Wei Wang
Qing Zhou
Abstract

Postoperative delirium (POD), a severe neuropsychiatric complication affecting nearly 50% of high-risk surgical patients, is defined as an acute disorder of attention and cognition. It remains significantly underdiagnosed in intensive care units (ICUs) due to subjective monitoring methods. Early and accurate diagnosis of POD is both critical and achievable. Here, we propose a POD prediction framework comprising a Transformer representation model followed by traditional machine learning algorithms. Our approach utilizes multi-modal physiological data, including amplitude-integrated electroencephalography (aEEG), vital signs, electrocardiographic monitor data, and hemodynamic parameters. We curated the first multi-modal POD dataset encompassing two patient types and evaluated various Transformer architectures for representation learning. Empirical results indicate consistent improvements in sensitivity and Youden index for TYPE I patients when using Transformer representations, particularly our fusion adaptation of Pathformer. By enabling effective delirium diagnosis from postoperative day 1 to day 3, our extensive experimental findings emphasize the potential of multi-modal physiological data and highlight the necessity of representation learning via multi-modal Transformer architectures in clinical diagnosis.
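The two-stage design described in the abstract — a Transformer-style representation model that embeds each patient's multi-modal time series, followed by a traditional machine learning classifier — can be sketched as below. This is a minimal illustrative sketch, not the paper's Pathformer adaptation: the single-head self-attention, mean-pooling, synthetic data, and logistic-regression head are all assumptions standing in for the actual architecture and dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def self_attention(X):
    """Single-head scaled dot-product self-attention over X of shape (seq_len, d)."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)
    # numerically stable softmax over the last axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ X

def embed(series):
    """Attend over timesteps, then mean-pool into one patient-level vector."""
    return self_attention(series).mean(axis=0)

# Synthetic stand-in for multi-modal physiological time series
# (e.g. aEEG + vital-sign channels): 40 patients, 24 timesteps, 8 channels.
n_patients, seq_len, d = 40, 24, 8
data = rng.normal(size=(n_patients, seq_len, d))
labels = rng.integers(0, 2, size=n_patients)  # hypothetical POD outcomes

# Stage 1: representation learning -> fixed-length features per patient.
features = np.stack([embed(s) for s in data])  # shape (n_patients, d)

# Stage 2: a traditional ML classifier on the learned representations.
clf = LogisticRegression(max_iter=1000).fit(features, labels)
risk = clf.predict_proba(features)[:, 1]  # per-patient POD risk scores
```

The key design point the abstract argues for is stage 1: rather than feeding raw or hand-summarized signals to the classifier, the attention step lets the embedding weight informative timesteps before pooling.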

@article{wang2025_2504.04120,
  title={Transformer representation learning is necessary for dynamic multi-modal physiological data on small-cohort patients},
  author={Bingxu Wang and Min Ge and Kunzhi Cai and Yuqi Zhang and Zeyi Zhou and Wenjiao Li and Yachong Guo and Wei Wang and Qing Zhou},
  journal={arXiv preprint arXiv:2504.04120},
  year={2025}
}