ChronoFormer: Time-Aware Transformer Architectures for Structured Clinical Event Modeling

10 April 2025
Yuanyun Zhang
Shi Li
Abstract

The temporal complexity of electronic health record (EHR) data presents significant challenges for predicting clinical outcomes with machine learning. This paper proposes ChronoFormer, a transformer-based architecture specifically designed to encode and leverage temporal dependencies in longitudinal patient data. ChronoFormer integrates temporal embeddings, hierarchical attention mechanisms, and domain-specific masking techniques. Extensive experiments on three benchmark tasks (mortality prediction, readmission prediction, and long-term comorbidity onset) demonstrate substantial improvements over current state-of-the-art methods. Furthermore, detailed analyses of attention patterns underscore ChronoFormer's ability to capture clinically meaningful long-range temporal relationships.
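The abstract names two of ChronoFormer's ingredients that can be illustrated concretely: temporal embeddings driven by real event timestamps (rather than sequence positions), and masked attention over the event sequence. The paper's actual implementation is not shown on this page; below is a minimal NumPy sketch of those two ideas, where the function names, the sinusoidal form of the embedding, and the causal mask are illustrative assumptions, not details from the paper.

```python
import numpy as np

def temporal_embedding(delta_t, dim):
    # Sinusoidal embedding of elapsed time (e.g. days since admission),
    # analogous to positional encodings but driven by real timestamps,
    # so irregular gaps between clinical events are represented directly.
    freqs = 1.0 / (10000 ** (np.arange(0, dim, 2) / dim))
    angles = np.outer(delta_t, freqs)            # (n_events, dim/2)
    emb = np.zeros((len(delta_t), dim))
    emb[:, 0::2] = np.sin(angles)
    emb[:, 1::2] = np.cos(angles)
    return emb

def masked_attention(q, k, v, mask):
    # Scaled dot-product attention with a boolean mask; positions where
    # mask is False (e.g. future events, or codes outside the relevant
    # clinical domain) are excluded from the softmax.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores = np.where(mask, scores, -1e9)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Toy patient record: 4 events at irregular times (days since admission).
times = np.array([0.0, 2.0, 2.5, 30.0])
dim = 16
x = np.random.default_rng(0).normal(size=(4, dim))
x = x + temporal_embedding(times, dim)           # inject the timing signal

causal = np.tril(np.ones((4, 4), dtype=bool))    # no attending to future events
out = masked_attention(x, x, x, causal)
print(out.shape)  # (4, 16)
```

Under the causal mask, the first event attends only to itself, so its output row equals its input row; later events mix information from all earlier ones, weighted by timestamp-aware similarity.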

@article{zhang2025_2504.07373,
  title={ChronoFormer: Time-Aware Transformer Architectures for Structured Clinical Event Modeling},
  author={Yuanyun Zhang and Shi Li},
  journal={arXiv preprint arXiv:2504.07373},
  year={2025}
}