AUTHOR=Oommen Chelsea, Howlett-Prieto Quentin, Carrithers Michael D., Hier Daniel B. TITLE=Inter-rater agreement for the annotation of neurologic signs and symptoms in electronic health records JOURNAL=Frontiers in Digital Health VOLUME=5 YEAR=2023 URL=https://www.frontiersin.org/journals/digital-health/articles/10.3389/fdgth.2023.1075771 DOI=10.3389/fdgth.2023.1075771 ISSN=2673-253X ABSTRACT=

The extraction of patient signs and symptoms recorded as free text in electronic health records is critical for precision medicine. Once extracted, signs and symptoms can be made computable by mapping them to concepts in an ontology. Extracting signs and symptoms from free text is tedious and time-consuming. Prior studies have suggested that inter-rater agreement for clinical concept extraction is low. We examined inter-rater agreement for annotating neurologic concepts in clinical notes from electronic health records. After training on the annotation process, the annotation tool, and the supporting neuro-ontology, three raters annotated 15 clinical notes in three rounds. Inter-rater agreement among the three annotators was high for both text span and category label. A machine annotator based on a convolutional neural network agreed closely with the human annotators, although its agreement was lower than the human inter-rater agreement. We conclude that high levels of agreement between human annotators are achievable with appropriate training and annotation tools. Furthermore, more training examples, combined with improvements in neural networks and natural language processing, should make machine annotators capable of high-throughput automated clinical concept extraction with high levels of agreement with human annotators.
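The abstract reports agreement among three annotators but does not state which agreement statistic the authors used. As an illustration only, a standard choice for a fixed number of raters labeling the same items is Fleiss' kappa; the sketch below computes it from scratch, with hypothetical category labels (the example labels such as "tremor" and "ataxia" are assumptions, not data from the study).

```python
from collections import Counter

def fleiss_kappa(annotations):
    """Fleiss' kappa for categorical labels from a fixed set of raters.

    annotations: list of lists, one inner list per annotated item,
    each containing the category label chosen by every rater.
    All items must have the same number of raters.
    """
    n_items = len(annotations)
    n_raters = len(annotations[0])
    categories = sorted({label for item in annotations for label in item})

    # n_ij: how many raters assigned category j to item i
    counts = [Counter(item) for item in annotations]

    # P_i: observed agreement within item i
    p_i = [
        (sum(c ** 2 for c in counter.values()) - n_raters)
        / (n_raters * (n_raters - 1))
        for counter in counts
    ]
    p_bar = sum(p_i) / n_items

    # p_j: overall proportion of assignments going to category j
    p_j = [
        sum(counter[cat] for counter in counts) / (n_items * n_raters)
        for cat in categories
    ]
    p_e = sum(p ** 2 for p in p_j)  # chance-expected agreement

    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: three raters, two clinical notes
labels = [["tremor", "tremor", "tremor"],
          ["ataxia", "ataxia", "ataxia"]]
print(fleiss_kappa(labels))  # perfect agreement -> 1.0
```

This is a minimal sketch, not the paper's method; it also does not handle the degenerate case where chance agreement equals 1 (all raters always choosing a single category).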