Audiovisual speech perception: Examining the McGurk illusion by fMRI at 7 Tesla

Gregor Rafael Szycik, Jörg Stadler, Thomas F Münte
Poster
Last modified: 2008-05-13

Abstract


In natural communication, speech perception is profoundly influenced by observable mouth movements; this additional visual information considerably facilitates intelligibility. Furthermore, audiovisually (AV) incongruent artificial speech, in which the auditory stream does not match the articulatory movements, may lead to novel percepts that match neither the auditory nor the visual information, as evidenced by the McGurk effect. The recent “hypothesize-and-test” model of AV speech perception accentuates the role of both speech motor areas and integrative brain sites in the vicinity of the superior temporal sulcus (STS). In this event-related 7 Tesla fMRI study we used three naturally spoken syllable pairs (BA, GA, DA) with matching AV information and one syllable pair designed to elicit the McGurk illusion (mcDA). The data were analysed by calculating linear contrasts, including the comparisons of mcDA vs. DA, BA, and GA, respectively. Illusory syllables elicited greater brain activity than naturally spoken syllables. Furthermore, there were hemispheric differences in the functional organisation of the STS: the left STS showed two clusters processing auditory and visual differences, respectively, whereas the right STS harboured both functions in a single cluster. Our data support and extend the model by showing hemispheric differences in the processing of AV speech.