ORIGINAL RESEARCH article

Front. Hum. Neurosci.
Sec. Cognitive Neuroscience
Volume 18 - 2024 | doi: 10.3389/fnhum.2024.1483024

Neural speech tracking and auditory attention decoding in everyday life

Provisionally accepted
  • 1 University of Oldenburg, Oldenburg, Germany
  • 2 Sonova Consumer Hearing GmbH, Wedemark, Germany

The final, formatted version of the article will be published soon.

    In our complex world, the auditory system plays a crucial role in perceiving and processing our environment. Humans can segregate concurrent auditory objects into streams, allowing them to focus on specific sounds, such as speech, and to suppress irrelevant ones. This attentional enhancement or suppression of sound processing is evident in neural data through a phenomenon called neural speech tracking. Previous studies have identified correlates of neural speech tracking in electroencephalography (EEG) data, but EEG measures are susceptible to motion artefacts, and the association between neural data and auditory objects is vulnerable to distraction. The current study investigated EEG-based auditory attention decoding in realistic everyday scenarios. N = 20 participants were exposed to the sound of a busy cafeteria or walked along busy and quiet streets while listening to one or two simultaneous speech streams. We also investigated the within-subject robustness of neural speech tracking estimates. Linear decoding models were used to quantify the magnitude of neural speech tracking. The results confirmed that neural speech tracking was strongest in single-speaker scenarios. In dual-speaker conditions, neural speech tracking was significantly stronger for the attended than for the ignored speaker, even in complex environments such as a busy cafeteria or outdoor settings. In conclusion, EEG-based attention decoding is feasible in highly complex and realistic everyday conditions while humans behave naturally.
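    The abstract does not give implementation details, but the linear decoding approach it names is commonly realized as a backward (stimulus-reconstruction) model: a ridge regression maps time-lagged EEG to the speech envelope, and the Pearson correlation between the reconstructed and actual envelope serves as the tracking magnitude; attention is decoded by comparing this correlation across candidate speakers. The sketch below illustrates that general idea on simulated data. All function names, the lag window, and the regularization value are illustrative assumptions, not the authors' pipeline.

    ```python
    import numpy as np

    def lagged_design(eeg, n_lags):
        """Stack time-lagged copies of each EEG channel (time x channels)
        into a design matrix, so the decoder integrates a short window
        of neural activity. Lag count is an illustrative choice."""
        n_t, n_ch = eeg.shape
        X = np.zeros((n_t, n_ch * n_lags))
        for lag in range(n_lags):
            X[lag:, lag * n_ch:(lag + 1) * n_ch] = eeg[:n_t - lag]
        return X

    def train_decoder(eeg, envelope, n_lags=16, ridge=1e2):
        """Fit a linear backward model (ridge regression) that
        reconstructs the speech envelope from multichannel EEG."""
        X = lagged_design(eeg, n_lags)
        XtX = X.T @ X + ridge * np.eye(X.shape[1])
        return np.linalg.solve(XtX, X.T @ envelope)

    def tracking_score(eeg, envelope, weights, n_lags=16):
        """Neural speech tracking magnitude: Pearson correlation
        between reconstructed and actual speech envelope."""
        recon = lagged_design(eeg, n_lags) @ weights
        return np.corrcoef(recon, envelope)[0, 1]

    # Toy demo: simulate EEG that weakly follows the attended envelope.
    rng = np.random.default_rng(0)
    n_t, n_ch = 5000, 8
    att = rng.standard_normal(n_t)   # attended envelope (placeholder)
    ign = rng.standard_normal(n_t)   # ignored envelope (placeholder)
    eeg = 0.5 * att[:, None] + rng.standard_normal((n_t, n_ch))

    w = train_decoder(eeg, att)
    r_att = tracking_score(eeg, att, w)
    r_ign = tracking_score(eeg, ign, w)
    # Attention decoding: the attended stream is the one whose envelope
    # is reconstructed with the higher correlation.
    print(r_att > r_ign)
    ```

    In practice such decoders are trained on held-out data with cross-validation, and decoding accuracy is the fraction of test segments in which the attended speaker yields the higher reconstruction correlation.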

    Keywords: mobile EEG, speech envelope tracking, auditory attention decoding, distraction, movement

    Received: 19 Aug 2024; Accepted: 23 Oct 2024.

    Copyright: © 2024 Straetmans-Oehme, Adiloglu and Debener. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Lisa Straetmans-Oehme, University of Oldenburg, Oldenburg, Germany

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.