AUTHOR=Dai Hong-Jie, Su Chu-Hsien, Lee You-Qian, Zhang You-Chen, Wang Chen-Kai, Kuo Chian-Jue, Wu Chi-Shin TITLE=Deep Learning-Based Natural Language Processing for Screening Psychiatric Patients JOURNAL=Frontiers in Psychiatry VOLUME=11 YEAR=2021 URL=https://www.frontiersin.org/journals/psychiatry/articles/10.3389/fpsyt.2020.533949 DOI=10.3389/fpsyt.2020.533949 ISSN=1664-0640 ABSTRACT=

The introduction of pre-trained language models in natural language processing (NLP) based on deep learning and the availability of electronic health records (EHRs) present a great opportunity to transfer the "knowledge" learned from data in the general domain to the analysis of unstructured textual data in clinical domains. This study explored the feasibility of applying NLP to a small EHR dataset to investigate the power of transfer learning to facilitate patient screening in psychiatry. A total of 500 patients were randomly selected from a medical center database. Three annotators with clinical experience reviewed the notes and made diagnoses for major/minor depression, bipolar disorder, schizophrenia, and dementia, forming a small and highly imbalanced corpus. Several state-of-the-art deep learning NLP methods, along with pre-trained models based on shallow or deep transfer learning, were adapted to develop models to classify the aforementioned diseases. We hypothesized that models relying on transferred knowledge would outperform models learned from scratch. The experimental results demonstrated that the models with pre-trained techniques outperformed the models without transferred knowledge by 0.11 and 0.28 in micro-averaged and macro-averaged F-scores, respectively. Our results also suggested that building multi-label models with the feature dependency strategy is superior to problem transformation, given its higher performance and simpler training process.
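The micro- and macro-averaged F-scores reported above weight the diagnostic labels differently: micro-averaging pools counts across all labels, while macro-averaging gives each label equal weight, so rare diagnoses in an imbalanced corpus pull the macro score down more sharply. A minimal pure-Python sketch of both metrics for multi-label predictions (the label indices and toy predictions below are illustrative, not the paper's data):

```python
def f1_scores(y_true, y_pred, n_labels):
    """Micro- and macro-averaged F1 for multi-label classification.

    y_true, y_pred: lists of sets of label indices, one set per note.
    """
    tp = [0] * n_labels  # true positives per label
    fp = [0] * n_labels  # false positives per label
    fn = [0] * n_labels  # false negatives per label
    for truth, pred in zip(y_true, y_pred):
        for label in range(n_labels):
            if label in pred and label in truth:
                tp[label] += 1
            elif label in pred:
                fp[label] += 1
            elif label in truth:
                fn[label] += 1

    def f1(t, p, n):
        denom = 2 * t + p + n
        return 2 * t / denom if denom else 0.0

    # Micro: pool all counts, then compute one F1.
    micro = f1(sum(tp), sum(fp), sum(fn))
    # Macro: compute F1 per label, then take the unweighted mean.
    macro = sum(f1(tp[l], fp[l], fn[l]) for l in range(n_labels)) / n_labels
    return micro, macro


# Hypothetical labels: 0=depression, 1=bipolar, 2=schizophrenia, 3=dementia.
# The frequent label (0) is predicted well; the rare ones (1, 3) are missed.
y_true = [{0}, {0}, {0}, {0, 1}, {2}, {3}]
y_pred = [{0}, {0}, {0}, {0}, {2}, set()]
micro, macro = f1_scores(y_true, y_pred, n_labels=4)
print(f"micro-F1={micro:.3f}, macro-F1={macro:.3f}")
```

On this toy corpus the micro-F1 is about 0.83 while the macro-F1 is 0.50, showing how missed rare labels depress the macro average — the same asymmetry behind the paper's larger macro-averaged gain (0.28) relative to the micro-averaged one (0.11).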