
ORIGINAL RESEARCH article

Front. Psychiatry
Sec. Computational Psychiatry
Volume 15 - 2024 | doi: 10.3389/fpsyt.2024.1462083
This article is part of the Research Topic Mental Health in the Age of Artificial Intelligence.

The Ethical Aspects of Integrating Sentiment and Emotion Analysis in Chatbots for Depression Intervention

Provisionally accepted
  • 1 Bern University of Applied Sciences, Bern, Switzerland
  • 2 Østfold University College, Halden, Østfold, Norway

The final, formatted version of the article will be published soon.

    Digital health interventions, specifically those realized as chatbots, are increasingly available for mental health. They include technologies based on artificial intelligence that assess users' sentiment and emotions, either to respond in an empathetic way or for treatment purposes, e.g. analyzing the expressed emotions and suggesting interventions. In this paper, we study the ethical dimensions of integrating these technologies into chatbots for depression intervention using the Digital Ethics Canvas and the DTx Risk Assessment Canvas. As a result, we identified specific risks associated with integrating sentiment and emotion analysis methods into these systems, related to the difficulty of correctly recognizing the expressed sentiment or emotion in statements from individuals with depressive symptoms and of determining the appropriate system reaction, including risk detection. Depending on how the sentiment or emotion analysis is realized, which may be dictionary-based or machine-learning-based, additional risks arise from biased training data or misinterpretations. While technology decisions during system development can be made carefully depending on the use case, other ethical risks cannot be prevented on a technical level but only by carefully integrating such chatbots into the care process, allowing for supervision by health professionals. We conclude that careful reflection is needed when integrating sentiment and emotion analysis into chatbots for depression intervention. Balancing risk factors is key to leveraging technology in mental health in a way that enhances, rather than diminishes, user autonomy and agency.
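    To illustrate the dictionary-based approach mentioned in the abstract, the following minimal sketch (a hypothetical example in Python, not taken from the article; the lexicon and scores are invented) shows how a simple lexicon lookup can misscore a statement typical of depressive language, e.g. one containing negation.

```python
# Minimal sketch of dictionary-based sentiment scoring (illustrative only;
# the lexicon and its scores are hypothetical, not from the article).

SENTIMENT_LEXICON = {
    "good": 1.0,
    "happy": 1.0,
    "hopeless": -1.0,
    "tired": -0.5,
    "worthless": -1.0,
}

def lexicon_sentiment(text: str) -> float:
    """Sum the lexicon scores of all known words in the text."""
    tokens = text.lower().replace(".", "").split()
    return sum(SENTIMENT_LEXICON.get(tok, 0.0) for tok in tokens)

# A statement with negation, typical of depressive language:
print(lexicon_sentiment("I do not feel good about anything anymore"))
# -> 1.0: the negation "not" is ignored, so the statement is scored as
#    positive, illustrating the misinterpretation risk discussed in the paper.
```

    Machine-learning-based analyzers can handle negation better, but, as the abstract notes, they introduce their own risk of bias when the training data underrepresent the language of people with depressive symptoms.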

    Keywords: Depression, Mental Health, emotion, sentiment, emotion analysis, Chatbot, conversational agent, ethics

    Running title: Ethics of Emotion Analysis in Chatbots

    Received: 09 Jul 2024; Accepted: 18 Oct 2024.

    Copyright: © 2024 Denecke and Gabarron. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Elia Gabarron, Østfold University College, Halden, 1757, Østfold, Norway

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.