Event-related potentials reflect speech-relevant somatosensory-auditory interactions
Takayuki Ito, Vincent L. Gracco, David J. Ostry

Last modified: 2011-09-02

Abstract


An interaction between orofacial somatosensation and the perception of speech was demonstrated in recent psychophysical studies (Ito et al. 2009; Ito and Ostry 2009). To explore further the neural mechanisms of this speech-related somatosensory-auditory interaction, we assessed the extent to which multisensory evoked potentials reflect multisensory interaction during speech perception. We also examined the dynamic modulation of multisensory integration resulting from relative timing differences between the onsets of the two sensory stimuli. We recorded event-related potentials from 64 scalp sites in response to somatosensory stimulation alone, auditory stimulation alone, and combined somatosensory and auditory stimulation. In the multisensory condition, the timing of the two stimuli was either simultaneous or offset by 90 ms (lead and lag). We found evidence of multisensory interaction: the amplitude of the multisensory evoked potential was reliably different from the sum of the two unisensory potentials around the first peak of the multisensory response (100-200 ms). The magnitude of the evoked potential difference varied as a function of the relative timing between the stimuli in the interval from 170 to 200 ms following somatosensory stimulation. The results demonstrate clear multisensory convergence and suggest a dynamic modulation of multisensory interaction during speech.

References


Ito T, Tiede M, Ostry DJ (2009) Somatosensory function in speech perception. Proc Natl Acad Sci U S A 106:1245-1248.

Ito T, Ostry DJ (2009) Perceptual modulation of facial skin sensation by auditory information from speech sounds. Annual Meeting of the Society for Neuroscience, Chicago, USA.