SOTAVerified

ChronoFormer: Time-Aware Transformer Architectures for Structured Clinical Event Modeling

2025-04-10 · Unverified

Yuanyun Zhang, Shi Li

Abstract

The temporal complexity of electronic health record (EHR) data presents significant challenges for predicting clinical outcomes with machine learning. This paper proposes ChronoFormer, a transformer-based architecture specifically designed to encode and leverage temporal dependencies in longitudinal patient data. ChronoFormer integrates temporal embeddings, hierarchical attention mechanisms, and domain-specific masking techniques. Extensive experiments on three benchmark tasks (mortality prediction, readmission prediction, and long-term comorbidity onset) demonstrate substantial improvements over current state-of-the-art methods. Furthermore, detailed analyses of attention patterns underscore ChronoFormer's capability to capture clinically meaningful long-range temporal relationships.
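The abstract does not specify ChronoFormer's exact temporal-embedding formulation. As a point of reference, a minimal sketch of one common choice for irregularly spaced clinical events is a continuous-time sinusoidal embedding: the standard transformer positional encoding, but driven by real elapsed time (e.g. days since admission) rather than token index. The function name and dimensions below are illustrative, not taken from the paper.

```python
import numpy as np

def time_embedding(timestamps, dim=16, max_timescale=10000.0):
    """Map each event's elapsed time (e.g. days since admission) to a
    dim-dimensional vector using sinusoids over a geometric range of
    timescales, analogous to transformer positional encodings but
    computed from real time gaps instead of sequence positions."""
    timestamps = np.asarray(timestamps, dtype=np.float64)
    half = dim // 2
    # Geometric progression of frequencies, as in the original
    # "Attention Is All You Need" positional encoding.
    freqs = 1.0 / (max_timescale ** (np.arange(half) / half))
    angles = timestamps[:, None] * freqs[None, :]
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)

# Four irregularly spaced clinical events (days since admission).
emb = time_embedding([0.0, 0.5, 3.0, 17.0], dim=8)
print(emb.shape)  # (4, 8)
```

Because the embedding depends only on the timestamp, two events separated by a long gap receive distinctly different vectors even if they are adjacent in the sequence, which is the property a time-aware attention mechanism can exploit.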
