Real-world unisensory and multisensory processing: Perceptual and neural mechanisms in complex natural scene processing

21K views · 18 authors · 4 articles · 3 editors
Original Research
27 October 2015
Complementary fMRI and EEG evidence for more efficient neural processing of rhythmic vs. unpredictably timed sounds
Nienke van Atteveldt, 4 more authors, and Charles Schroeder
Schematic representation of the stimulation design. (A) An exemplar run, consisting of eight blocks: four with rhythmic tones, four with random tones. (B) The timing of tones in the rhythmic 1.6 Hz (upper time scale) and random 1.6 Hz (lower scale) blocks. In rhythmic blocks, tones were presented at a regular SOA of 624 ms. In random blocks, tones were presented at a random onset time (interval 20–604 ms after trial onset), producing an irregular SOA between 40 and 1208 ms and a mean SOA of 624 ms. “Beeps” are standard tones (440 Hz) and “boops” target tones (5–10% lower in frequency). Note that the overall structure was the same in the 2.2 Hz driving rate condition, but the exact times were different: the SOA was 454 ms during rhythmic blocks, and between 20 and 434 ms during random blocks.

The brain’s fascinating ability to adapt its internal neural dynamics to the temporal structure of the sensory environment is becoming increasingly clear. Aligning ongoing oscillatory activity to the relevant inputs in a predictable stream is thought to be metabolically beneficial, because those inputs then arrive at optimal processing phases of the spontaneously occurring rhythmic excitability fluctuations. However, some contexts have a more predictable temporal structure than others. Here, we tested the hypothesis that rhythmic sounds are processed more efficiently than irregularly timed sounds. To do this, we simultaneously measured functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) while participants detected oddball target sounds in alternating blocks of rhythmic (i.e., with equal inter-stimulus intervals) or random (i.e., with randomly varied inter-stimulus intervals) tone sequences. Behaviorally, participants detected target sounds faster and more accurately when they were embedded in rhythmic streams. The fMRI response in the auditory cortex was stronger during random than during rhythmic tone sequence processing. Simultaneously recorded N1 responses showed larger peak amplitudes and longer latencies for tones in the random (vs. the rhythmic) streams. These results provide complementary evidence for more efficient neural and perceptual processing in temporally predictable sensory contexts.
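The timing manipulation described in the figure caption can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' stimulation code: the 624 ms trial length and the 20–604 ms jitter window for the 1.6 Hz condition are taken from the caption, while the function names and everything else are assumptions. In random blocks, drawing one onset per 624 ms trial from that window yields stimulus onset asynchronies (SOAs) between 40 and 1208 ms with a mean SOA of 624 ms, matching the rhythmic rate.

```python
import random

def tone_onsets(n_trials, trial_ms=624, rhythmic=True,
                jitter_ms=(20, 604), seed=0):
    """Generate tone onset times (in ms) for one block (illustrative).

    Rhythmic blocks: one tone per trial at a fixed 624 ms SOA.
    Random blocks: one tone per trial at a uniformly random offset
    of 20-604 ms after trial onset, producing SOAs between 40 and
    1208 ms with a mean SOA of 624 ms.
    """
    rng = random.Random(seed)
    onsets = []
    for i in range(n_trials):
        trial_start = i * trial_ms
        offset = 0 if rhythmic else rng.uniform(*jitter_ms)
        onsets.append(trial_start + offset)
    return onsets

def soas(onsets):
    """Stimulus onset asynchronies between consecutive tones."""
    return [b - a for a, b in zip(onsets, onsets[1:])]
```

The same structure holds for the 2.2 Hz condition with `trial_ms=454` and `jitter_ms=(20, 434)`.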

4,562 views · 14 citations
Original Research
05 August 2015

Correlated sensory inputs coursing along the individual sensory processing hierarchies arrive at multisensory convergence zones in cortex where inputs are processed in an integrative manner. The exact hierarchical level of multisensory convergence zones and the timing of their inputs are still under debate, although increasingly, evidence points to multisensory integration (MSI) at very early sensory processing levels. While MSI is said to be governed by stimulus properties including space, time, and magnitude, violations of these rules have been documented. The objective of the current study was to determine, both psychophysically and electrophysiologically, whether differential visual-somatosensory (VS) integration patterns exist for stimuli presented to the same versus opposite hemifields. Using high-density electrical mapping and complementary psychophysical data, we examined multisensory integrative processing for combinations of visual and somatosensory inputs presented to both left and right spatial locations. We assessed how early during sensory processing VS interactions were seen in the event-related potential and whether spatial alignment of the visual and somatosensory elements resulted in differential integration effects. Reaction times to all VS pairings were significantly faster than those to the unisensory conditions, regardless of spatial alignment, pointing to engagement of integrative multisensory processing in all conditions. In support, electrophysiological results revealed significant differences between multisensory simultaneous VS and summed V + S responses, regardless of the spatial alignment of the constituent inputs. Nonetheless, multisensory effects were earlier in the aligned conditions, and were found to be particularly robust in the case of right-sided inputs (beginning at just 55 ms). In contrast to previous work on audio-visual and audio-somatosensory inputs, the current work suggests a degree of spatial specificity to the earliest detectable multisensory integrative effects in response to VS pairings.

5,801 views · 21 citations
Original Research
02 June 2015

In well-controlled laboratory experiments, researchers have found that humans can perceive delays between auditory and visual signals as short as 20 ms. Conversely, other experiments have shown that humans can tolerate audiovisual asynchrony that exceeds 200 ms. This seeming contradiction in human temporal sensitivity can be attributed to a number of factors such as experimental approaches and precedence of the asynchronous signals, along with the nature, duration, location, complexity and repetitiveness of the audiovisual stimuli, and even individual differences. In order to better understand how temporal integration of audiovisual events occurs in the real world, we need to close the gap between the experimental setting and the complex setting of everyday life. With this work, we aimed to contribute one brick to the bridge that will close this gap. We compared perceived synchrony for long-running and eventful audiovisual sequences to shorter sequences that contain a single audiovisual event, for three types of content: action, music, and speech. The resulting windows of temporal integration showed that participants were better at detecting asynchrony for the longer stimuli, possibly because the long-running sequences contain multiple corresponding events that offer audiovisual timing cues. Moreover, the points of subjective simultaneity differ between content types, suggesting that the nature of a visual scene could influence the temporal perception of events. An expected outcome from this type of experiment was the rich variation among participants' distributions and the derived points of subjective simultaneity. Hence, the designs of similar experiments call for more participants than traditional psychophysical studies. Heeding this caution, we conclude that existing theories on multisensory perception are ready to be tested on more natural and representative stimuli.

7,565 views · 23 citations
Open for submission

Frontiers in Human Neuroscience

Multisensory integration: unveiling the complexities of perception
Edited by Alessandro Bortolotti, Baingio Pinna, Nicola Di Stefano, Caterina Padulo
Deadline: 02 June 2025
Recommended Research Topics

Frontiers in Psychology

Sub- and supra-second timing: brain, learning and development
Edited by Lihan Chen, Yan Bao, Marc Wittmann
98.2K views · 52 authors · 18 articles

Frontiers in Psychology

Optical Veiling Glare and Neural Processing: Spatial Partners in Vision
Edited by John J. McCann, Thomas J.T.P. Van Den Berg, Eli Peli, Alessandro Rizzi, Vassilios Vonikakis
30.3K views · 11 authors · 4 articles

Frontiers in Integrative Neuroscience

The Multisensory Plastic Brain along the Life Span: from Neural Basis to Clinical Applications
Edited by Nadia Bolognini, Sofia Allegra Crespi
100.2K views · 57 authors · 18 articles

Frontiers in Robotics and AI

Closing the Loop: From Human Behavior to Multisensory Robots
Edited by Pablo Vinicius Alves De Barros, Doreen Jirak, German I. Parisi, Jun Tani
65.1K views · 24 authors · 5 articles