This study investigates whether audiovisual phonetic training with hand gestures encoding visible or nonvisible articulatory features has a differential impact on learning second language sounds. Ninety-nine Catalan–Spanish bilingual students were trained to differentiate English /æ/ and /ʌ/, which differ in visible lip aperture and nonvisible tongue position, under one of three training conditions: no gestures, gestures representing lip aperture, or gestures representing tongue position. Before, immediately after, and 1 week after the training, participants’ perception of the target vowels was assessed through a word-identification task, and their production was tested through paragraph-reading, picture-naming, and word-imitation tasks. Although all participants improved in both perception and production, the lip gesture condition was more effective at adjusting lip aperture than the other two conditions in the paragraph-reading and picture-naming tasks. These results suggest that hand gestures encoding visible rather than nonvisible articulatory features are more effective for improving second language pronunciation.