Visual predictions influence speed and size of auditory responses: an EEG study

Tim Paris, Jeesun Kim, Chris Davis

Last modified: 2013-05-05

Abstract


Dynamic stimuli often provide predictive information about upcoming events in a different modality. Within a predictive coding framework, such expected events should require less processing than unexpected ones. We tested this idea using neural and behavioural measures of auditory processing in a vision-predicts-sound paradigm.
In the experiment, participants were presented with a visual stimulus followed by either a low or high tone while EEG was recorded from a 256-channel active system. The visual stimulus was either a picture (No prediction), a movie cueing the timing of an upcoming auditory event (Event prediction), or a movie cueing the type of tone (Tone prediction). In one third of trials, interspersed throughout the experiment, participants also responded to the tone with a button press.
In both prediction conditions, participants responded faster to predicted than to unpredicted sounds. ERP results showed that visual Event predictions led to significantly smaller P2 amplitudes, whereas Tone predictions led to larger P2 amplitudes. The results will be discussed in terms of a predictive coding account.

Keywords


prediction; audiovisual; ERP; context
