Recent research has reported that adding non-existent diacritical marks to a word produces only a minimal reading cost relative to the intact word. Here we examined whether this minimal reading cost reflects: (1) the resilience of letter detectors to perceptual noise (in which case the cost should be small and comparable for words and nonwords) or (2) top-down lexical processes that normalize the percept for words (in which case the cost should be larger for nonwords).
We designed a letter detection experiment in which a target stimulus (either a word or a nonword) was presented either intact or with added non-existent diacritics [e.g., …].
Although the task engaged lexical processing (responses were faster and more accurate for words than for nonwords), we found only a minimal advantage in error rates for intact stimuli over those with non-existent diacritics, and this advantage was similar for words and nonwords.
Thus, the letter detectors in the word recognition system appear to be resilient to non-existent diacritics, without the need for feedback from higher (lexical) levels of processing.