AUTHOR=Horbach, Andrea and Zesch, Torsten TITLE=The Influence of Variance in Learner Answers on Automatic Content Scoring JOURNAL=Frontiers in Education VOLUME=4 YEAR=2019 URL=https://www.frontiersin.org/journals/education/articles/10.3389/feduc.2019.00028 DOI=10.3389/feduc.2019.00028 ISSN=2504-284X ABSTRACT=

Automatic content scoring is an important application in the area of automatic educational assessment. Short texts written by learners are scored based on their content, while spelling and grammar mistakes are usually ignored. The difficulty of automatically scoring such texts varies according to the variance within the learner answers. In this paper, we first discuss factors that influence variance in learner answers, so that practitioners can better estimate whether automatic scoring might be applicable to their usage scenario. We then compare the two main paradigms in content scoring, (i) similarity-based and (ii) instance-based methods, and discuss how well each can deal with the variance-inducing factors described above.
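The two paradigms named in the abstract can be illustrated with a deliberately minimal sketch. This toy code is not from the paper: the Jaccard token-overlap measure, the 1-nearest-neighbour labeling, and all example answers are hypothetical stand-ins for the far richer features and models used in real content-scoring systems. It only shows the structural difference: a similarity-based scorer compares a learner answer against a teacher-written reference, while an instance-based scorer learns from previously scored learner answers.

```python
def similarity_score(answer: str, reference: str) -> float:
    """Similarity-based paradigm (toy version): score an answer by its
    token overlap (Jaccard similarity) with one reference answer."""
    a, r = set(answer.lower().split()), set(reference.lower().split())
    return len(a & r) / len(a | r) if a | r else 0.0

def instance_score(answer: str, scored_answers: list[tuple[str, int]]) -> int:
    """Instance-based paradigm (toy version): assign the label of the
    most similar previously scored learner answer (1-nearest-neighbour)."""
    return max(scored_answers,
               key=lambda pair: similarity_score(answer, pair[0]))[1]

# Hypothetical example data for a short-answer science question.
reference = "water evaporates because the sun heats it"
training = [("the sun heats the water so it evaporates", 1),
            ("water disappears by magic", 0)]

new_answer = "heat from the sun makes the water evaporate"
print(round(similarity_score(new_answer, reference), 2))
print(instance_score(new_answer, training))
```

Note how the similarity-based scorer is sensitive to surface variance (the correct "evaporate" vs. "evaporates" do not match as tokens), whereas the instance-based scorer can absorb such variance if it appears in the scored training answers.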