Behaviour, language, and reasoning are expressions of brain functions par excellence; yet the brain can only draw on sensory modalities to gather information on the rest of the body and on the outer world. Traditionally, cortical areas processing the identity and location of sensory inputs were thought to be organised hierarchically, with certain branches processing basic features and other branches processing complex features. Thus, for example, visual inputs would initially pass through lower-level visual areas and then through higher-level visual areas. Only at later stages would multisensory integration take place in the association zones, eventually ensuring conscious perception and the recruitment of relevant muscles to execute complex motor plans.
Yet this picture of brain functioning began to fade as evidence accumulated highlighting widespread 'multisensory' processing, with inputs from different senses being integrated prior to conscious perception. Recent studies of multimodal emotion integration (e.g., face and voice) have revealed synergistic effects both at early sensory cortices and at higher-level association areas responsible for the cognitive evaluation of affective information. Similarly, perceptual learning in temporal discrimination has been shown to transfer readily from one sensory modality to another. Further behavioural evidence suggests that complex events are interpreted via a continuous loop between intentions and sensory input: on the one hand, observers use sensory inputs to segment an event sequence into units, which over time become tied to knowledge about agents' intentions; on the other hand, hierarchical event schemas facilitate the perception of event structure, helping observers segment and organise their experiences.
A less hierarchical functional architecture of the brain has since emerged, in which, irrespective of sensory modality, inputs are allocated to the best-suited cortical substrate. For example, the prediction of the so-called 'neural exploitation hypothesis' that neural circuits initially used for a specific purpose (e.g., motor control) can be re-used for other purposes (e.g., language) has recently been confirmed with a twist. In particular, behavioural studies have provided evidence that language reflects specific characteristics of action organisation in the perceptual and motor systems (e.g., chained organisation) and that, in turn, language can modify these characteristics in important ways. Activation of grasp-related affordances, for instance, as when attention targets the graspable parts of a perceived object, is amplified following visual cues but not following linguistic cues.
Our Research Topic welcomes contributions on multisensory integration and sensory adaptation encompassing all aspects of cognition, motion, and emotion.