INVESTIGATION OF EVENT-RELATED BRAIN POTENTIALS OF AUDIO-VISUAL SPEECH PERCEPTION IN BACKGROUND NOISE
Axel H Winneke, Natalie A Phillips
Poster
Abstract
We investigated event-related potentials (ERPs) to audio-visual (AV) speech in background babble noise. Participants (N = 7) perceived single spoken words presented in random order in auditory-alone (A) trials, visual-alone (V) trials (i.e., lip-reading), and a combined AV modality. ERPs were recorded to the onset of the mouth movement and/or the sound of spoken object names that participants categorized as natural (e.g., tree) or artificial (e.g., bike). Compared to A-alone and V-alone trials, responses to AV trials were the fastest (p < .01) and most accurate (p < .01). This AV benefit was accompanied by a smaller amplitude of the auditory N1 ERP component at central sites relative to A-alone trials (p = .045) and also relative to the summed response of the A- and V-alone trials (p = .033). The data furthermore indicate a tendency for the N1 to peak slightly earlier during AV trials (~130 ms) than during A-alone trials (~145 ms). These preliminary results imply that adding visual speech cues to auditory speech in a noisy environment enhances early auditory processing, possibly because the lips serve as a visual cue (i.e., directing attention and/or providing complementary visemes) for the auditory system, which can thus process the speech signal more efficiently.
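The additive-model comparison reported above (AV versus the summed A- and V-alone responses) can be illustrated with a short sketch. This is a minimal illustration, not the authors' analysis pipeline: the sampling rate, epoch window, the 100-180 ms N1 search window, and the helper n1_amplitude are assumptions for the example, and the arrays stand in for per-subject average ERPs at a central electrode.

```python
import numpy as np
from scipy import stats

# Hypothetical dimensions: 7 participants, epochs from -100 ms to +500 ms
# around stimulus onset, sampled at an assumed 250 Hz.
FS = 250
TIMES = np.arange(-0.1, 0.5, 1 / FS)  # seconds relative to onset
N_SUBJ = 7

rng = np.random.default_rng(0)
# Placeholder per-subject average ERPs at a central site; in practice
# these come from the recorded and averaged EEG for each modality.
erp_a = rng.normal(size=(N_SUBJ, TIMES.size))   # auditory-alone
erp_v = rng.normal(size=(N_SUBJ, TIMES.size))   # visual-alone
erp_av = rng.normal(size=(N_SUBJ, TIMES.size))  # audio-visual

def n1_amplitude(erp, times, window=(0.10, 0.18)):
    """Most negative value per subject in an assumed 100-180 ms N1 window."""
    mask = (times >= window[0]) & (times <= window[1])
    return erp[:, mask].min(axis=1)

# Additive model: if AV processing were purely the sum of the unimodal
# responses, the AV ERP should match A + V; a reliably smaller (less
# negative) AV N1 suggests non-additive audio-visual interaction.
n1_av = n1_amplitude(erp_av, TIMES)
n1_sum = n1_amplitude(erp_a + erp_v, TIMES)

t, p = stats.ttest_rel(n1_av, n1_sum)  # paired t-test across subjects
print(f"AV vs. (A+V) N1 amplitude: t({N_SUBJ - 1}) = {t:.2f}, p = {p:.3f}")
```

Summing the unimodal waveforms before extracting the peak, rather than summing the peak amplitudes, is the usual way this comparison is set up, since the A and V peaks need not occur at the same latency.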