- 1Department of Psychology, University of Jyväskylä, Jyväskylä, Finland
- 2Institute of Psychology, University of Tartu, Tartu, Estonia
- 3Institute of Cognitive Neuroscience and Psychology, Research Centre for Natural Sciences (RCNS), Budapest, Hungary
Editorial on the Research Topic
Visual mismatch negativity (vMMN): A unique tool in investigating automatic processing
Since 1978, when mismatch negativity (MMN) was reported for the first time in the auditory modality (Näätänen et al., 1978), MMN has been used to investigate automatic change detection. The first reports of visual mismatch negativity (vMMN) were published ~20 years after the discovery of the auditory MMN (Tales et al., 1999). Thereafter, vMMN has been used to study basic visual feature processing (Czigler et al., 2002; Astikainen et al., 2008; Grimm et al., 2009) as well as complex visual information processing, mostly face information (Astikainen and Hietanen, 2009; Kreegipuu et al., 2013), in normative and clinical populations (for a review see, Kremláček et al., 2016).
Later, vMMN, as well as its auditory counterpart, was linked to predictive coding theory (Stefanics et al., 2014), which states that the brain predicts future events based on representations of previous sensory events (Friston, 2005). This is well in line with Risto Näätänen's original idea that the MMN reflects a memory-comparison process between the sensory memory of the standards and the "deviant" sensory input (Näätänen, 1990). In the predictive coding framework, the (v)MMN is suggested to reflect a prediction error, which occurs when new sensory input does not fit the predicted model (Friston, 2005). Predictive coding is suggested to be aberrant in several psychiatric conditions (for a recent review, see Smith et al., 2021). Therefore, vMMN is also an important tool for investigating predictive mechanisms in neuropsychiatric conditions, and it may have potential for clinical applications.
The present Research Topic includes seven articles: six report empirical investigations of automatic visual deviance detection, while one reviews vMMN findings. Each of these articles makes an important contribution to our understanding of automatic visual cognition.
Four studies used faces as stimuli to elicit the vMMN. Zeng et al. utilized an oddball and an equiprobable stimulus condition to disentangle the effects of stimulus probability and low-level stimulus features on the vMMN. The vMMN amplitude was affected by neither factor. The results thus connect the face-elicited vMMN to the memory-comparison theory of the mismatch negativity (Näätänen et al., 2005).
Two articles studied the possible "coarse-to-fine" principle in face processing. The article by Wang et al. investigated the processing of configural (i.e., the distance between the eyes and between the mouth and nose) and featural (i.e., the shape of the eyes and mouth) information in non-attended faces. They found that the vMMN emerged in the 200–360 ms latency range for configural, but not featural, face information, suggesting that different mechanisms underlie automatic configural and featural face processing. Lacroix et al. investigated the effects of low and high spatial frequencies in faces on the vMMN. The vMMN was larger for high than for low spatial frequency faces at approximately 300–400 ms post-stimulus, reflecting a lower prediction error for low than for high spatial frequencies. In addition, the P100 was more sensitive to low spatial frequencies, while the face-sensitive N170 was more sensitive to high spatial frequencies. This pattern could be in line with the coarse-to-fine hypothesis.
Ford et al. investigated the vMMN to neutral, happy, and sad faces in healthy adults. They found that the vMMN amplitude to happy faces tended to correlate with interpersonal features associated with autism and schizophrenia: participants with larger vMMN amplitudes to happy faces showed higher levels of interpersonal difficulty. The article contributes to the growing literature searching for brain activity markers of neuropsychiatric conditions.
The article by Cheng et al. used Chinese characters as stimuli to investigate the visual processing of self-referential information. The names of the participants' birthplaces and university cities were presented as rare deviant stimuli in a series of standard stimuli representing irrelevant cities. The results demonstrated a vMMN, strongest over the left occipito-temporal region, to the birthplace but not to the university city. These findings suggest that visually presented birthplace information can be processed automatically, similarly to one's own face and name, which have previously been studied in the context of self-referential information.
Kurita et al. investigated whether visual change detection can affect access processing to visual awareness (APVA). The vMMN amplitude and perceptual alternation in binocular rivalry were measured in response to an orientation deviant that was perceived unconsciously, and the facilitation of APVA by the deviant was estimated. The relationship between the vMMN and APVA was examined through inter-individual variability, that is, the correlation across participants between vMMN enhancement and the increase in the proportion of perceptual alternations. The results demonstrated that the unconsciously presented deviant stimulus was more likely to reach consciousness. In addition, increased vMMN amplitude was associated with facilitated perceptual alternation in binocular rivalry when an unconscious deviant was presented.
Czigler and Kojouharova reviewed 12 studies investigating vMMN in non-clinical populations (athletes, internet addicts), the effects of specific conditions (hypoxia, mental fatigue), and the sensitivity of vMMN to special stimuli, like artwork.
Altogether, the Research Topic extends our understanding of preattentive visual processing, and it strongly contributes to the face processing literature. The whole set of articles validates the vMMN as a useful tool for estimating the preattentive processing of visual stimuli whenever attentional processing or verbal reports are not accessible.
Author contributions
PA drafted the manuscript. KK and IC revised the text. All authors contributed to the article and approved the submitted version.
Funding
KK was supported by the Estonian Research Council grant (PRG1151). PA was supported by the Academy of Finland (Grant Number: 2100005537).
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher's note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
Astikainen, P., and Hietanen, J. K. (2009). Event-related potentials to task-irrelevant changes in facial expressions. Behav. Brain Funct. 5, 30.
Astikainen, P., Lillstrang, E., and Ruusuvirta, T. (2008). Visual mismatch negativity for changes in orientation - a sensory memory-dependent response. Eur. J. Neurosci. 28, 2319–2324. doi: 10.1111/j.1460-9568.2008.06510.x
Czigler, I., Balázs, L., and Winkler, I. (2002). Memory-based detection of task-irrelevant visual changes. Psychophysiology 39, 869–873. doi: 10.1111/1469-8986.3960869
Friston, K. (2005). A theory of cortical responses. Philos. Trans. R. Soc. B 360, 815–836. doi: 10.1098/rstb.2005.1622
Grimm, S., Bendixen, A., Deouell, L. Y., and Schröger, E. (2009). Distraction in a visual multi-deviant paradigm: behavioral and event-related potential effects. Int. J. Psychophysiol. 72, 260–266. doi: 10.1016/j.ijpsycho.2009.01.005
Kreegipuu, K., Kuldkepp, N., Sibolt, O., Toom, M., Allik, J., and Näätänen, R. (2013). vMMN for schematic faces: automatic detection of change in emotional expression. Front. Hum. Neurosci. 7, 714. doi: 10.3389/fnhum.2013.00714
Kremláček, J., Kreegipuu, K., Tales, A., Astikainen, P., Põldver, N., Näätänen, R., et al. (2016). Visual mismatch negativity (vMMN): a review and meta-analysis of studies in psychiatric and neurological disorders. Cortex 80, 76–112. doi: 10.1016/j.cortex.2016.03.017
Näätänen, R. (1990). The role of attention in auditory information processing as revealed by event-related potentials and other brain measures of cognitive function. Behav. Brain Sci. 13, 201–288. doi: 10.1017/S0140525X00078407
Näätänen, R., Gaillard, A. W. K., and Mäntysalo, S. (1978). Early selective-attention effect on evoked potential reinterpreted. Acta Psychol. 42, 313–329. doi: 10.1016/0001-6918(78)90006-9
Näätänen, R., Jacobsen, T., and Winkler, I. (2005). Memory-based or afferent processes in mismatch negativity (MMN): a review of the evidence. Psychophysiology 42, 25–32.
Smith, R., Badcock, P., and Friston, K. J. (2021). Recent advances in the application of predictive coding and active inference models within clinical neuroscience. Psychiatry Clin. Neurosci. 75, 3–13. doi: 10.1111/pcn.13138
Stefanics, G., Kremláček, J., and Czigler, I. (2014). Visual mismatch negativity: a predictive coding view. Front. Hum. Neurosci. 8, 666. doi: 10.3389/fnhum.2014.00666
Keywords: visual mismatch negativity (visual MMN), event-related potentials, change detection, preattentive processing, visual system
Citation: Astikainen P, Kreegipuu K and Czigler I (2022) Editorial: Visual mismatch negativity (vMMN): A unique tool in investigating automatic processing. Front. Hum. Neurosci. 16:1056208. doi: 10.3389/fnhum.2022.1056208
Received: 28 September 2022; Accepted: 10 October 2022;
Published: 07 November 2022.
Edited by:
Jae Kun Shim, University of Maryland, United States
Reviewed by:
Karl Friston, University College London, United Kingdom
Copyright © 2022 Astikainen, Kreegipuu and Czigler. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Piia Astikainen, piia.astikainen@jyu.fi