Omar Boursalie, Reza Samavi, and Thomas E Doyle (2021)
Decoder Transformer for Temporally-Embedded Health Outcome Predictions
In: 2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA), pp. 1461-1467, IEEE.
Deep learning models are increasingly being used to predict patients’ diagnoses by analyzing electronic health records. Medical records represent observations of a patient’s health over time. A commonly used approach to analyze health records is to encode them as a sequence of ordered diagnoses (diagnostic-level encoding). Transformer models then analyze the sequence of diagnoses to learn disease patterns. However, the elapsed time between medical visits is not considered when transformers are used to analyze health records. In this paper, we present DTTHRE: Decoder Transformer for Temporally-Embedded Health Records Encoding, which predicts patients’ diagnoses by analyzing their medical histories. In DTTHRE, instead of diagnostic-level encoding, we propose an encoding representation for health records called THRE: Temporally-Embedded Health Records Encoding. THRE encodes patient histories as a sequence of medical events such as age, sex, and diagnosis while incorporating the elapsed time between visits. We evaluate a proof-of-concept DTTHRE on a real-world medical dataset and compare our model’s performance to an existing diagnostic transformer model in the literature. DTTHRE successfully predicted patients’ final diagnoses on the medical dataset with improved predictive performance (78.54 ± 0.22%) compared to the existing model in the literature (40.51 ± 0.13%).
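To make the encoding idea concrete, the sketch below shows one way a THRE-style representation could be built: each medical event (sex, age, diagnosis) becomes a learned token embedding, and the elapsed time since the previous visit is projected and added to it before the sequence is passed to a decoder transformer. This is an illustrative assumption, not the authors' implementation; the class name, vocabulary size, dimensions, and the additive way the elapsed time is injected are all hypothetical.

```python
# Illustrative sketch of a temporally-embedded event encoding (assumed details,
# not the DTTHRE authors' code).
import torch
import torch.nn as nn


class TemporallyEmbeddedEncoder(nn.Module):
    def __init__(self, n_event_tokens: int, d_model: int = 64):
        super().__init__()
        # One shared vocabulary for sex, age-bucket, and diagnosis tokens (assumption).
        self.event_embed = nn.Embedding(n_event_tokens, d_model)
        # Project elapsed days between visits into the same embedding space.
        self.time_proj = nn.Linear(1, d_model)

    def forward(self, event_ids: torch.Tensor, elapsed_days: torch.Tensor) -> torch.Tensor:
        # event_ids:    (batch, seq_len) integer ids of medical events
        # elapsed_days: (batch, seq_len) days since the previous visit (0 for the first)
        tok = self.event_embed(event_ids)
        tim = self.time_proj(elapsed_days.unsqueeze(-1).float())
        # Temporally-embedded event sequence, ready for a decoder transformer.
        return tok + tim


# Toy usage: one patient with four events and the days elapsed before each.
enc = TemporallyEmbeddedEncoder(n_event_tokens=1000)
event_ids = torch.tensor([[7, 42, 311, 512]])          # hypothetical token ids
elapsed = torch.tensor([[0.0, 0.0, 30.0, 400.0]])      # days between visits
seq = enc(event_ids, elapsed)
print(seq.shape)  # torch.Size([1, 4, 64])
```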