AUTHOR=Shickel Benjamin, Silva Brandon, Ozrazgat-Baslanti Tezcan, Ren Yuanfang, Khezeli Kia, Guan Ziyuan, Tighe Patrick J., Bihorac Azra, Rashidi Parisa TITLE=Multi-dimensional patient acuity estimation with longitudinal EHR tokenization and flexible transformer networks JOURNAL=Frontiers in Digital Health VOLUME=4 YEAR=2022 URL=https://www.frontiersin.org/journals/digital-health/articles/10.3389/fdgth.2022.1029191 DOI=10.3389/fdgth.2022.1029191 ISSN=2673-253X ABSTRACT=

Transformer model architectures have revolutionized the natural language processing (NLP) domain and continue to produce state-of-the-art results in text-based applications. Prior to the emergence of transformers, traditional NLP models such as recurrent and convolutional neural networks demonstrated promising utility for patient-level predictions and health forecasting from longitudinal datasets. However, to our knowledge, only a few studies have explored transformers for predicting clinical outcomes from electronic health record (EHR) data, and in our estimation, none have adequately derived a health-specific tokenization scheme to fully capture the heterogeneity of EHR systems. In this study, we propose a dynamic method for tokenizing both discrete and continuous patient data, and present a transformer-based classifier utilizing a joint embedding space for integrating disparate temporal patient measurements. We demonstrate the feasibility of our clinical AI framework through multi-task ICU patient acuity estimation, where we simultaneously predict six mortality and readmission outcomes. Our longitudinal EHR tokenization and transformer modeling approaches resulted in more accurate predictions compared with baseline machine learning models, suggesting opportunities for future multimodal data integrations and algorithmic support tools using clinical transformer networks.
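
The sketch below is an illustrative, hedged interpretation of the general approach described in the abstract, not the authors' implementation: continuous measurements are mapped into discrete tokens (here via quantile binning, an assumed choice), discrete and continuous events share one embedding vocabulary, and a transformer encoder feeds six sigmoid heads for simultaneous mortality and readmission predictions. All names (QuantileTokenizer, MultiTaskEHRTransformer), bin counts, pooling strategy, and hyperparameters are assumptions made for demonstration only.

```python
# Minimal sketch (assumed design, not the paper's method): tokenize mixed
# discrete/continuous EHR data into a shared vocabulary and classify with a
# transformer encoder carrying six binary outcome heads.
import numpy as np
import torch
import torch.nn as nn


class QuantileTokenizer:
    """Maps a continuous variable to a token id via quantile bins (assumed scheme)."""

    def __init__(self, train_values, n_bins=10, offset=1):
        # Interior bin edges estimated from training data; offset places this
        # variable's tokens in its own region of the shared vocabulary.
        self.edges = np.quantile(train_values, np.linspace(0, 1, n_bins + 1)[1:-1])
        self.offset = offset

    def __call__(self, x):
        return self.offset + int(np.digitize(x, self.edges))


class MultiTaskEHRTransformer(nn.Module):
    """Transformer encoder over tokenized EHR events with six sigmoid output heads."""

    def __init__(self, vocab_size, d_model=128, n_heads=4, n_layers=2, n_tasks=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.heads = nn.ModuleList([nn.Linear(d_model, 1) for _ in range(n_tasks)])

    def forward(self, token_ids, pad_mask):
        h = self.encoder(self.embed(token_ids), src_key_padding_mask=pad_mask)
        pooled = h.mean(dim=1)  # simple mean pooling over the event sequence
        return torch.cat([torch.sigmoid(head(pooled)) for head in self.heads], dim=-1)


# Example usage with synthetic values standing in for a continuous lab measurement.
tokenizer = QuantileTokenizer(np.random.normal(100, 20, 1000), n_bins=10, offset=1)
ids = torch.tensor([[tokenizer(95.0), tokenizer(130.0), 0, 0]])  # padded sequence
pad_mask = ids == 0                                              # True at padding
model = MultiTaskEHRTransformer(vocab_size=64)
print(model(ids, pad_mask).shape)                                # torch.Size([1, 6])
```

The joint embedding here simply assigns each variable its own token range so that categorical codes and binned continuous values coexist in one lookup table; the paper's actual tokenization and multi-task formulation should be consulted for the details.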