AUTHOR=Katz William F., Mehta Sonya TITLE=Visual Feedback of Tongue Movement for Novel Speech Sound Learning JOURNAL=Frontiers in Human Neuroscience VOLUME=9 YEAR=2015 URL=https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2015.00612 DOI=10.3389/fnhum.2015.00612 ISSN=1662-5161 ABSTRACT=

Pronunciation training studies have yielded important insights into the processing of audiovisual (AV) information. Second language (L2) learners rely more heavily on bottom-up, multimodal input for speech perception than monolingual individuals do. However, little is known about the role of viewing one's own articulation during speech training. The present study investigated whether real-time visual feedback of tongue movement can improve a speaker's learning of non-native speech sounds. An interactive 3D tongue visualization system based on electromagnetic articulography (EMA) was used in a speech training experiment. Native speakers of American English produced a novel speech sound (/ɖ/; a voiced, coronal, palatal stop) before, during, and after trials in which they viewed their own speech movements on the 3D model. Talkers' productions were evaluated using kinematic (tongue-tip spatial positioning) and acoustic (burst spectra) measures. The results indicated a rapid gain in accuracy associated with visual feedback training. The findings are discussed with respect to neural models of multimodal speech processing.
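
The two evaluation measures named in the abstract, tongue-tip spatial positioning from EMA sensor data and burst spectra from the acoustic signal, can be illustrated with a minimal Python sketch. This is not the authors' analysis code; the target position, window length, and sampling rate below are hypothetical stand-ins, and the data are synthetic.

```python
# Minimal sketch of the two measures described in the abstract (not the
# authors' code): (1) tongue-tip distance from a target position, from
# EMA-style 3D coordinates; (2) a magnitude spectrum of the stop burst.
import numpy as np

def tongue_tip_error(positions_mm, target_mm):
    """Euclidean distance (mm) of each tongue-tip sample from a target.

    positions_mm: (N, 3) array of EMA tongue-tip sensor coordinates.
    target_mm:    (3,) reference position (hypothetical here).
    """
    return np.linalg.norm(positions_mm - target_mm, axis=1)

def burst_spectrum(signal, fs, burst_start_s, win_ms=20.0):
    """Magnitude spectrum (dB) of a short window at the stop-burst onset.

    signal: 1-D audio samples; fs: sampling rate in Hz.
    The 20 ms Hann window is an assumed analysis choice.
    """
    start = int(burst_start_s * fs)
    n = int(win_ms / 1000.0 * fs)
    frame = signal[start:start + n] * np.hanning(n)
    spec = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, 20.0 * np.log10(spec + 1e-12)

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Synthetic tongue-tip trajectory scattered around a made-up target.
    target = np.array([0.0, 55.0, 10.0])  # mm, illustrative only
    traj = target + rng.normal(scale=3.0, size=(100, 3))
    print("mean tongue-tip error (mm):",
          tongue_tip_error(traj, target).mean())

    # Synthetic recording with a noise transient ("burst") at 0.5 s.
    fs = 16000
    audio = rng.normal(scale=0.01, size=fs)
    audio[fs // 2: fs // 2 + 160] += rng.normal(scale=0.5, size=160)
    freqs, db = burst_spectrum(audio, fs, burst_start_s=0.5)
    print("spectral peak (Hz):", freqs[np.argmax(db)])
```

In a study like this, per-trial tongue-tip errors could be compared before and after feedback training, and burst spectra compared against productions of a reference speaker; the sketch shows only the measurement step, not that comparison.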