Visual anticipatory information modulates audiovisual cross-modal interactions of artificial stimuli
Jean Vroomen, Jeroen Stekelenburg
Poster
Last modified: 2008-05-09
Abstract
The neural response to speech sounds, as indexed by the auditory N1 component, can be suppressed when a speech sound is accompanied by concordant lip movements. Here we demonstrate that this audiovisual interaction is neither speech-specific nor tied to human-like actions: it can be observed with artificial stimuli as well. When a pure tone was accompanied by the visual squeeze of a rectangle, the N1 was suppressed if the temporal occurrence of this audiovisual event was made predictable by two moving disks that touched the rectangle. There was no N1 reduction when the synchronized visual information did not precede sound onset, and the N1 reduction was also abolished when the delay between the audiovisual event and the disks varied from trial to trial. These results demonstrate that N1 suppression is induced by visual information that both precedes and reliably predicts sound onset, without a necessary link to neural mechanisms specific to human action.