Research has identified simulation-based training with chatbots and virtual avatars as an effective educational strategy in domains such as medicine and the mental health disciplines. Several studies of interactive systems have also suggested that user experience is decisive for adoption. As interest grows, it becomes important to examine the factors influencing user acceptance of and trust in simulation-based training systems, and to validate their applicability to specific learning tasks. The aim of this research is twofold: (1) to examine perceived acceptance of and trust in a risk assessment training chatbot developed to help students assess the risk and needs of juvenile offenders, and (2) to examine the factors influencing students’ perceptions of acceptance and trust.
Participants were 112 criminology students enrolled in an undergraduate course at a Canadian university. Participants were directed to use a custom-designed chatbot with a virtual 3D avatar for training in juvenile offender risk assessment, and to complete online questionnaires and a risk assessment exercise.
Results show satisfactory levels of acceptance of and trust in the chatbot. Concerning acceptance, more than half of participants were satisfied or very satisfied with the chatbot; concerning trust, most participants were neutral or satisfied with the chatbot’s benevolence and credibility.
Results suggest that acceptance and trust depend not only on the design of the chatbot software but also on the characteristics of the user, most prominently self-efficacy, state anxiety, learning style, and the neuroticism personality trait. As trust and acceptance play a vital role in determining the success of a technology, these results are encouraging.