AUTHOR=Invitto Sara, Calcagnì Antonio, Mignozzi Arianna, Scardino Rosanna, Piraino Giulia, Turchi Daniele, De Feudis Irio, Brunetti Antonio, Bevilacqua Vitoantonio, de Tommaso Marina TITLE=Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias JOURNAL=Frontiers in Behavioral Neuroscience VOLUME=11 YEAR=2017 URL=https://www.frontiersin.org/journals/behavioral-neuroscience/articles/10.3389/fnbeh.2017.00144 DOI=10.3389/fnbeh.2017.00144 ISSN=1662-5153 ABSTRACT=
Recent research on the crossmodal integration of visual and auditory perception suggests that the evaluation of emotional information in one sensory modality may be drawn toward the emotional value generated in another sensory modality. This implies that emotions elicited by musical stimuli can influence, through a top-down process, the perception of emotional stimuli presented in other sensory modalities. The aim of this work was to investigate how crossmodal perceptual processing modulates emotional face recognition, and how this modulation varies with the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 event-related potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings were that musicians' behavioral responses and N170 components were more strongly affected by the emotional value of the music presented during the emotional go/no-go task, and that this bias was also apparent in responses to the non-target emotional faces. This suggests that emotional information arriving through multiple sensory channels activates a crossmodal integration process that depends on the stimuli's emotional salience and the listener's appraisal.
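The abstract mentions fitting a linear mixed-effects model to N170 amplitudes. As an illustration only, the sketch below fits such a model with `statsmodels` on simulated data; the dataset, effect sizes, variable names (`n170`, `group`, `music`, `subject`), and the exact model formula are assumptions for demonstration, not the authors' actual analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: N170 amplitudes for 24 subjects (12 musicians,
# 12 non-musicians) listening to three pieces of music. All values
# are simulated; they do not come from the study.
rng = np.random.default_rng(0)
rows = []
for subj in range(24):
    group = "musician" if subj < 12 else "nonmusician"
    for music in ["Albeniz", "Chopin", "Mozart"]:
        for _trial in range(10):
            # Assumed effect: musicians show a slightly larger (more
            # negative) N170; Gaussian trial-to-trial noise.
            amp = -4.0 + (-1.0 if group == "musician" else 0.0) + rng.normal(0, 1)
            rows.append({"subject": subj, "group": group,
                         "music": music, "n170": amp})
df = pd.DataFrame(rows)

# Linear mixed-effects model: fixed effects of group, music, and their
# interaction; a random intercept per subject captures between-subject
# variability in overall amplitude.
model = smf.mixedlm("n170 ~ group * music", df, groups=df["subject"])
result = model.fit()
print(result.summary())
```

The random intercept per subject is what distinguishes this from ordinary regression: repeated trials from the same participant are not treated as independent observations.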