BRIEF RESEARCH REPORT article

Front. Comput. Sci.
Sec. Human-Media Interaction
Volume 6 - 2024 | doi: 10.3389/fcomp.2024.1499165
This article is part of the Research Topic Human-Centered Artificial Intelligence in Interaction Processes.

Investigating the Impacts of Auditory and Visual Feedback in Advanced Driver Assistance Systems: A Pilot Study on Driver Behavior and Emotional Response

Provisionally accepted
  • 1 School of Computer, Data and Mathematical Sciences, Western Sydney University, Campbelltown, New South Wales, Australia
  • 2 School of Business, Western Sydney University, Penrith, New South Wales, Australia
  • 3 College of Information Technology, United Arab Emirates University, Al Ain, Abu Dhabi, United Arab Emirates

The final, formatted version of the article will be published soon.

    In the autonomous vehicle industry, Advanced Driver Assistance Systems (ADAS) are recognized for their capacity to enhance service quality, improve on-road safety, and increase driver comfort. These systems can provide multi-modal feedback, including auditory, visual, and vibrotactile cues. This study concentrates on assessing the impacts of auditory and visual feedback from assistive driving systems on drivers. Five participants (N=5) were recruited to take part in two sets of driving experiments. During the experimental sessions, they were exposed to several driver reminders, in audio-only and audio-visual formats respectively. Their driving behaviors and performance were observed by the researchers, while their emotions were evaluated with a YOLOv5 detection model. The results reveal that participants showed a higher compliance rate and stronger emotional reactions (especially anger, sadness, and surprise) to the unimodal feedback of audio-only driving reminders. There is no strong evidence that the bimodal ADAS feedback of audio-visual cues effectively improves drivers' performance while driving. However, both the emotion data and the user satisfaction results indicate that participants experienced an increase in feelings of happiness when they were able to visualize the AI assistant while hearing its audio reminders. This study serves as one of the pioneering studies aimed at strengthening the theoretical foundation of automotive user interface design, particularly the design of auditory functions.
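
    The abstract names YOLOv5 as the emotion evaluation model. As a rough illustration, the sketch below shows how a custom-trained YOLOv5 checkpoint could be run over recorded driver footage to tally per-frame emotion detections. The weights file, video path, and emotion class names are assumptions for illustration only; the authors' actual training setup and pipeline are not described in the abstract.

```python
# Minimal sketch of per-frame emotion detection with a custom-trained
# YOLOv5 model. The weights file 'emotion_yolov5.pt', the recording
# 'driver_session.mp4', and the emotion labels are hypothetical.
import cv2
import torch

# Load a custom YOLOv5 checkpoint through the official torch.hub entry point.
model = torch.hub.load('ultralytics/yolov5', 'custom', path='emotion_yolov5.pt')
model.conf = 0.4  # confidence threshold for detections

cap = cv2.VideoCapture('driver_session.mp4')
counts = {}
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # YOLOv5 expects RGB input for numpy arrays; OpenCV reads BGR.
    results = model(frame[..., ::-1])
    # Tally the predicted emotion label of each detected face in the frame.
    for label in results.pandas().xyxy[0]['name']:
        counts[label] = counts.get(label, 0) + 1
cap.release()
print(counts)  # e.g. {'anger': 120, 'happiness': 85, ...}
```

    Aggregating per-frame label counts in this way would yield the kind of session-level emotion distribution (anger, sadness, surprise, happiness) that the abstract reports comparing across the audio-only and audio-visual conditions.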

    Keywords: Robot voice, Emotion detection, Advanced driver assistance systems, Human-robot interaction, Humanoid agents

    Received: 20 Sep 2024; Accepted: 27 Dec 2024.

    Copyright: © 2024 Zou, Khan, Lwin, Alnajjar and Mubin. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Aila Khan, School of Business, Western Sydney University, Penrith, NSW 2751, Australia

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.