ORIGINAL RESEARCH article

Front. Robot. AI
Sec. Human-Robot Interaction
Volume 11 - 2024 | doi: 10.3389/frobt.2024.1384610
This article is part of the Research Topic Perceiving, Generating, and Interpreting Affect in Human-Robot Interaction (HRI) View all 5 articles

Customisation's Impact on Strengthening Affective Bonds and Decision-Making with Socially Assistive Robots

Provisionally accepted
  • 1 University of Bristol, Bristol, United Kingdom
  • 2 Kempten University of Applied Sciences, Kempten, Germany
  • 3 University of the West of England, Bristol, England, United Kingdom

The final, formatted version of the article will be published soon.

    This study aims to fill a gap in understanding how customising robots affects how humans interact with them, specifically regarding human decision-making and robot perception. The study focused on the robot's ability to persuade participants to follow its suggestions within the Balloon Analogue Risk Task (BART), in which participants were challenged to balance the risk of bursting a virtual balloon against the potential reward of inflating it further.

    A between-subjects design was used, involving 62 participants divided evenly between customised and non-customised robot conditions. Compliance, risk-taking, reaction time, and perceptions of the robot's likability, intelligence, trustworthiness, and ownership were measured using quantitative and qualitative methods.

    The results showed no significant differences in compliance or risk-taking behaviours between the customised and non-customised robot conditions. However, participants in the customised condition reported a significant increase in perceived ownership. Additionally, reaction times were longer in the customised condition, particularly for the "collect" suggestion. These results indicate that although customisation may not directly affect compliance or risk-taking, it enhances cognitive engagement and personal connection with robots.

    Regardless of customisation, the presence of a robot significantly influenced risk-taking behaviours, supporting theories of over-trust in robots and automation bias. These findings highlight the importance of careful ethical design and effective communication strategies when developing socially assistive robots, so as to manage user trust and expectations, particularly in applications involving behavioural influence.
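    The core mechanics of the BART paradigm described above can be sketched in a few lines. The following is a minimal illustrative simulation, not the study's actual implementation; the burst probability, reward per pump, and collection threshold are placeholder values chosen for the example.

    ```python
    import random

    def bart_trial(burst_prob=0.05, reward_per_pump=1, collect_at=10, rng=None):
        """Simulate one BART trial: each pump adds to a temporary bank
        but risks bursting the balloon and losing the bank.

        Returns (pumps, winnings). All parameter values are illustrative.
        """
        rng = rng or random.Random()
        bank = 0
        pumps = 0
        while True:
            # Decision point: "collect" keeps the bank; "pump" risks it.
            # A simple stand-in policy: collect once a threshold is reached.
            if bank >= collect_at:
                return pumps, bank       # collect: winnings are banked
            if rng.random() < burst_prob:
                return pumps + 1, 0      # balloon bursts: bank is lost
            pumps += 1
            bank += reward_per_pump
    ```

    In the study itself, the pump-or-collect decision was made by a human participant (with the robot offering suggestions) rather than by a fixed policy; the threshold rule here merely stands in for that choice.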

    Keywords: Customisation in Robotics, Personalised Robots, Affective Bonds in HRI, Trust in Robotics, Decision-Making in HRI, Socially Assistive Robots (SAR), Human-Robot Interaction (HRI), Persuasive Robots

    Received: 09 Feb 2024; Accepted: 03 Sep 2024.

    Copyright: © 2024 Ahmed, Giuliani, Leonards and Bremner. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Mohammed Shabaj Ahmed, University of Bristol, Bristol, United Kingdom

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.