
ORIGINAL RESEARCH article

Front. Hum. Neurosci.
Sec. Cognitive Neuroscience
Volume 18 - 2024 | doi: 10.3389/fnhum.2024.1467403
This article is part of the Research Topic Investigating Learning and Cognitive States Using Multimodal Approaches.

Understanding Emotional Influences on Sustained Attention: A Study Using Virtual Reality and Neurophysiological Monitoring

Provisionally accepted
  • 1 Beijing Normal University, Beijing, Beijing Municipality, China
  • 2 Vanderbilt University, Nashville, Tennessee, United States

The final, formatted version of the article will be published soon.

    Emotion and attention regulation significantly influence many aspects of human functioning and behavior, yet how emotion and attention interact to shape performance remains underexplored. This study focuses on sustained attention, the aspect of attention most relevant to task performance, namely the maintenance of attention and control throughout a task, and investigates how individual differences in sustained attention, in response to variations in emotional state, affect task performance. A total of 12 participants underwent emotion induction through Virtual Reality (VR) videos, completed an AX continuous performance test (AX-CPT) to measure sustained attention, with task performance evaluated in terms of accuracy and reaction time, and reported their flow states. Electroencephalography (EEG) and photoplethysmography (PPG) data were collected throughout the sessions as supporting evidence of sustained attention. Our findings suggest that emotional valence and arousal significantly influence reaction times and sustained attention once gender differences are accounted for, but do not significantly affect task accuracy. Specifically, males responded faster under high-arousal negative emotions, whereas females responded faster under high-arousal positive emotions. Additionally, flow experience was not significantly affected by emotional state or sustained attention. These results underscore the nuanced interplay between emotion, sustained attention, and task performance, and support the use of VR, EEG, and PPG technologies in future research on related topics.
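
    To make the reported analysis concrete, the following is a minimal, hypothetical sketch (not the authors' code) of how a valence x arousal x gender effect on reaction times could be examined with a linear mixed-effects model in Python. The column names (rt, valence, arousal, gender, participant) and the synthetic data are assumptions for illustration only; the original study's variables and modeling choices may differ.

    # Hypothetical sketch: mixed-effects model of reaction time (RT)
    # with a valence x arousal x gender interaction. Synthetic data only.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n_participants, n_trials = 12, 80
    rows = []
    for p in range(n_participants):
        gender = "male" if p % 2 == 0 else "female"
        for _ in range(n_trials):
            valence = rng.choice(["negative", "positive"])
            arousal = rng.choice(["low", "high"])
            rt = 450 + rng.normal(0, 40)  # baseline RT in ms (assumed)
            # Illustrative interaction: males faster for high-arousal negative,
            # females faster for high-arousal positive stimuli.
            if arousal == "high" and (gender == "male") == (valence == "negative"):
                rt -= 20
            rows.append(dict(participant=p, gender=gender,
                             valence=valence, arousal=arousal, rt=rt))
    df = pd.DataFrame(rows)

    # Random intercept per participant accounts for repeated trials;
    # fixed effects include the three-way valence x arousal x gender term.
    model = smf.mixedlm("rt ~ valence * arousal * gender",
                        data=df, groups=df["participant"])
    print(model.fit().summary())

    Treating participant as a grouping (random-intercept) factor reflects the repeated-measures design described in the abstract; the specific effect sizes in the synthetic data are arbitrary.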

    Keywords: sustained attention, emotion, virtual reality, electroencephalogram, photoplethysmography

    Received: 23 Jul 2024; Accepted: 30 Sep 2024.

    Copyright: © 2024 Yang, Zheng, Li and Tian. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Xuetao Tian, Beijing Normal University, Beijing, 100875, Beijing Municipality, China

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.