Perception under noisy conditions requires not only feature identification but also a process whereby target features are selected and noise is filtered out (e.g., when identifying an animal hiding in the savannah). Interestingly, previous perceptual learning studies demonstrated the utility of training feature representation (without noise) for improving discrimination under noisy conditions. Furthermore, learning to filter out noise also appears to transfer to other perceptual tasks under similar noisy conditions. However, such learning transfer effects have thus far been demonstrated predominantly with simple stimuli. Here we sought to explore whether similar learning transfer can be observed with complex real-world stimuli.
We assessed the feature-to-noise transfer effect using complex stimuli of human faces. We first examined participants' performance on a face-noise task following training either in the same task or in a different face-feature task. Second, we assessed the transfer effect across noise tasks that differed in stimulus complexity: simple stimuli (Gabors) versus complex stimuli (faces).
We found a clear learning transfer effect in the face-noise task following learning of face features. In contrast, we found no transfer effect across the different noise tasks (from Gabor-noise to face-noise).
These results extend previous findings on the transfer of feature learning to noisy conditions by demonstrating it with complex real-world stimuli.