EDITORIAL article

Front. Comput. Sci., 13 December 2023
Sec. Human-Media Interaction
This article is part of the Research Topic "Technology for the Greater Good? The Influence of (Ir)responsible Systems on Human Emotions, Thinking and Behavior."

Editorial: Technology for the greater good? The influence of (ir)responsible systems on human emotions, thinking and behavior

  • 1School of Design and Informatics, Abertay University, Dundee, United Kingdom
  • 2School of Computing, University of Kent, Canterbury, United Kingdom
  • 3Human-Computer Interaction Center, RWTH Aachen University, Aachen, Germany

Advanced technological systems have a tremendous impact on our lives, organizations, and societies (Stephanidis et al., 2019). We may be checking our communications on our smartphones in the morning, accessing our social media sites and operating smart systems on the go, while reading the latest articles based on AI-powered recommendations. Arguably every aspect of our lives is entangled with technology, be it how we communicate and interact with each other, how we entertain ourselves, how we maintain our households, safety, security, and wellbeing, how we manage our resources, or how we travel, work, or educate ourselves. Like never before, social media platforms provide a universal means of networking with others, spanning the globe, with immediate impact, and with access limited only by cultural or political frontiers. Digital characters and robots provide us with social support and companionship, thus exceeding their traditional role of providing utilitarian value. In line with Reeves and Nass's (1996) seminal book "The Media Equation," the notion of computers as social actors continues to inspire future work, especially as digital entities appear to behave like sentient beings in increasingly sophisticated ways. Our decision-making may be influenced by recommender systems or social media, and smart systems take over tasks that we once performed ourselves. The ubiquitous presence of technology systems affects societies in many ways, raising interesting philosophical perspectives (Van de Poel, 2020).

Against this background, research on human-technology interaction has struggled to keep pace, with new articles appearing alongside each wave of technological invention. Technology use is inextricably linked with some form of user experience. However, the consequences of human-technology interactions are not always clear-cut. To the extent that we shape technology, it shapes us in turn, often in unanticipated ways. AI, owing to its malleable nature, can be compromised, as in the case of Microsoft's chatbot "Tay," which produced inflammatory posts within a day of use (Neff and Nagy, 2016). There is well-documented evidence regarding social media use and its (adverse) effects on psychological wellbeing (Twenge, 2019). It will be of interest to observe whether and how lawmakers and policymakers regulate the design of choice architectures and monitor technology-induced incidents to prevent future harm. With the proliferation of such technologies, it can be argued that new challenges, but also opportunities, will arise for human (and non-human) users of such systems.

This Research Topic on “Technology for the greater good?” comprises a collection of papers exploring novel work in human-technology interaction, with the aim of identifying new challenge spaces and topics. The scope of the Research Topic, given the volume of innovations, is necessarily non-exhaustive, but aims to provide an overview of pertinent issues that affect current society and individuals.

The first group of studies explores perceived privacy and security aspects of technology. Belen-Saglam et al. studied the disclosure of sensitive data; their findings suggest that the use of conversational agents detrimentally affected disclosure in the health domain, but less so in the financial domain. Hildebrandt et al. explored users' privacy concerns in mobile forensics, with users preferring the release of less personal data, such as geo-spatial data, over more personal data, such as photographs, and favoring automated rather than human evaluations. Finally, Brauner et al. examined public perceptions of the use of AI, reporting that cybersecurity threats were perceived as highly likely and were the least liked. People scoring higher on dispositional trust held more favorable views of AI than people with lower trust. The findings highlight the intricacies of user decision-making and user acceptance in relation to handling sensitive material, which creators of technology should take into account.

The second group of contributions investigates user experiences and behavior when interacting with robots or robotic process automation. Employing a lab-based study, Maalouly et al. reported that users were more altruistic toward a humanoid robot after sustained conversation, which suggests that anthropomorphic technology can elicit pro-social behavior. Filgueiras et al. showed that, while multi-faceted, the user experience after prolonged use of robotic process automation reflected user acceptance and adoption, especially where the automation provided utilitarian value to the user.

The final set of articles highlights how technology interacts with people's dispositions, states, and experiences. In relation to procrastination, Sümer and Büttner reported that higher boredom proneness, lower self-control, and lower perseverance were predictive of different types of online procrastination. With a focus on social media and mental health, Scarpulla et al. showed that more active social media use was associated with increased anxiety and stress as well as poorer emotion-recognition skills, whereas passive social media use was not associated with these variables. Protzko and Schooler demonstrated that people were more inclined to view technological-societal shifts as corrupting today's youth if they had not encountered the technology themselves during their formative years, pointing to the role that personal experience plays in shaping beliefs about society and the technology it is exposed to. Xie and Liu demonstrated that trust in social media platforms relates to perceived information quality, perceived privacy, a sense of social belonging, self-esteem, and positive emotion. The work by Kaminger et al. revealed that dispositional gratitude can act as a protective factor when using social media by moderating the relationship between social comparison and both malicious and general envy. The experiences of and interactions with human-centered technology are multi-faceted and give rise to equally diverse research findings.

In conclusion, we thank all involved in the preparation of this Research Topic, contributing from various disciplines, countries, and contexts. The contributions underscore the importance of considering user perceptions and experiences as pivotal factors in steering future human-technology innovations. We hope that these developments ultimately contribute to the creation of systems that assist and benefit their users and society, removing the question mark in the title of this Research Topic—in other words, in technology for the greater good.

Author contributions

AS: Conceptualization, Writing – original draft. LS: Writing – review & editing. JN: Writing – review & editing. PB: Writing – review & editing. MZ: Writing – review & editing.

Funding

The author(s) declare that no financial support was received for the research, authorship, and/or publication of this article.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

The author(s) declared that they were an editorial board member of Frontiers at the time of submission. This had no impact on the peer review process or the final decision.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Neff, G., and Nagy, P. (2016). Talking to bots: symbiotic agency and the case of Tay. Int. J. Commun. 10, 4915–4931.

Reeves, B., and Nass, C. (1996). The Media Equation: How People Treat Computers, Television, and New Media like Real People and Places. Cambridge: Cambridge University Press.

Stephanidis, C., Salvendy, G., Antona, M., Chen, J. Y., Dong, J., Duffy, V. G., et al. (2019). Seven HCI grand challenges. Int. J. Human–Comput. Interact. 35, 1229–1269. doi: 10.1080/10447318.2019.1619259

Twenge, J. M. (2019). More time on technology, less happiness? Associations between digital-media use and psychological well-being. Curr. Dir. Psychol. Sci. 28, 372–379. doi: 10.1177/0963721419838244

Van de Poel, I. (2020). Three philosophical perspectives on the relation between technology and society, and how they affect the current debate about artificial intelligence. Human Affairs 30, 499–511. doi: 10.1515/humaff-2020-0042

Keywords: human-centered technology, artificial intelligence, user experience, social media, robotics, user characteristics, security and privacy

Citation: Szymkowiak A, Shepherd LA, Nurse JRC, Brauner P and Ziefle M (2023) Editorial: Technology for the greater good? The influence of (ir)responsible systems on human emotions, thinking and behavior. Front. Comput. Sci. 5:1341692. doi: 10.3389/fcomp.2023.1341692

Received: 20 November 2023; Accepted: 22 November 2023;
Published: 13 December 2023.

Edited and reviewed by: Roberto Therón, University of Salamanca, Spain

Copyright © 2023 Szymkowiak, Shepherd, Nurse, Brauner and Ziefle. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Andrea Szymkowiak, a.szymkowiak@abertay.ac.uk
