BRIEF RESEARCH REPORT article

Front. Neurosci.

Sec. Auditory Cognitive Neuroscience

Volume 19 - 2025 | doi: 10.3389/fnins.2025.1535759

This article is part of the Research Topic Crossing Sensory Boundaries: Multisensory Perception Through the Lens of Audition.

Visual nudging of navigation strategies improves frequency discrimination during auditory-guided locomotion

Provisionally accepted
  • 1 Ludwig Maximilian University of Munich, Munich, Bavaria, Germany
  • 2 Technical University of Munich, Munich, Bavaria, Germany

The final, formatted version of the article will be published soon.

    Perception in natural environments requires integrating multisensory inputs while navigating our surroundings. During locomotion, sensory cues such as vision and audition change coherently, providing crucial environmental information. This integration may affect perceptual thresholds due to sensory interference. Vision often dominates in multimodal contexts, overshadowing auditory information and potentially degrading audition. While traditional laboratory experiments offer controlled insights into sensory integration, they often fail to replicate the dynamic, multisensory interactions of real-world behavior.

    We used a naturalistic paradigm in which participants navigated an arena in search of a target, guided by position-dependent auditory cues. Previous findings showed that frequency discrimination thresholds during self-motion matched those in stationary paradigms, even though participants often relied on visually dominated navigation instead of auditory feedback. This suggested that vision might affect auditory perceptual thresholds in naturalistic settings.

    Here, we manipulated visual input to examine its effect on frequency discrimination and search strategy selection. By degrading visual input, we nudged participants' attention toward audition, leveraging subtle sensory adjustments to promote adaptive use of auditory cues without restricting participants' freedom of choice. This approach thus explores how attentional shifts influence multisensory integration during self-motion.

    Our results show that frequency discrimination thresholds improved when visual input was restricted, suggesting that reducing visual interference can increase auditory sensitivity. This is consistent with adaptive behavioral theories, which hold that individuals dynamically adjust their perceptual strategies to leverage the most reliable sensory inputs. These findings contribute to a better understanding of multisensory integration, highlighting the flexibility of sensory systems in complex environments.
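
    To make the paradigm description above concrete, the following is a minimal illustrative sketch (in Python) of one way a position-dependent auditory cue could be generated during navigation. The distance-to-frequency mapping, the frequency anchors, and the arena size are assumptions chosen for illustration only and are not taken from the article.

    # Minimal illustrative sketch (not from the article): map a participant's
    # distance from a hidden target onto the frequency of a pure tone.
    # All names and parameter values below are assumptions.
    import numpy as np

    F_REFERENCE = 1000.0   # Hz, tone frequency far from the target (assumed)
    F_TARGET = 1200.0      # Hz, tone frequency at the target location (assumed)
    ARENA_RADIUS = 2.0     # m, distance used for normalization (assumed)

    def cue_frequency(participant_xy, target_xy):
        """Interpolate the cue-tone frequency from the distance to the hidden target."""
        distance = np.linalg.norm(np.asarray(participant_xy) - np.asarray(target_xy))
        proximity = 1.0 - min(distance / ARENA_RADIUS, 1.0)  # 0 far away, 1 at target
        return F_REFERENCE + proximity * (F_TARGET - F_REFERENCE)

    def pure_tone(frequency_hz, duration_s=0.2, sample_rate=44100):
        """Generate the waveform of the position-dependent cue tone."""
        t = np.arange(int(duration_s * sample_rate)) / sample_rate
        return np.sin(2 * np.pi * frequency_hz * t)

    # Example: a participant 0.5 m from the target hears a tone between the two anchors.
    print(cue_frequency((0.0, 0.5), (0.0, 0.0)))  # -> 1150.0 Hz

    In such a scheme, a participant's ability to home in on the target depends on discriminating small frequency changes produced by their own movement, which is why visual dominance during navigation can interact with auditory thresholds.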

    Keywords: Auditory Perception, Audiovisual interaction, active sensing, sensory nudging, multimodal integration, SITh, naturalistic experimental design, Self-movement

    Received: 27 Nov 2024; Accepted: 07 Mar 2025.

    Copyright: © 2025 Malzacher, Hilbig, Pecka and Ferreiro. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Dardo N Ferreiro, Ludwig Maximilian University of Munich, Munich, 80539, Bavaria, Germany

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
