Dynamic modulation of object processing stream during cross-modal integration
Single Paper Presentation
Lorina Naci
University of Cambridge
Abstract ID Number: 148
Full text: Not available
Last modified: March 19, 2006
Presentation date: 06/19/2006 4:00 PM in Hamilton Building, Foyer
Abstract
It has been suggested that, during object processing, auditory and visual object features are analyzed within hierarchically structured sensory processing streams running from sensory-specific cortex to superior/inferior temporal cortex, and are integrated in antero-medial temporal regions (Simmons & Barsalou, 2003; Taylor et al., 2005). This EEG study aimed to investigate the timing of cross-modal effects on auditory/visual sensory processes and on conceptual-semantic processes involved in object processing. High-density (128-channel) ERPs were recorded from fifteen healthy participants while they performed a congruency task on auditory, visual, and audio-visual stimuli. The activation loci from an event-related fMRI study using the same stimuli and task (Taylor et al., 2005) were used to constrain the source analysis of the grand-averaged ERPs. Cross-modal effects influenced sensory processes from 60 ms and conceptual-semantic processes between 150 ms and 450 ms. Auditory and visual sensory ERP components (P1, N1) were enhanced in the cross-modal condition. Cross-modal conceptual-semantic processes started at around 150 ms and peaked again between 400 and 450 ms, times at which activity in anterior temporal cortex became significantly stronger than in posterior-occipital cortex. The later (400-500 ms) effects were right-lateralized, as may be expected for the processing of non-linguistic stimuli. We discuss the implications of these findings with reference to hierarchical models of object processing.
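For illustration only, the following is a minimal sketch of the kind of grand-averaging and peak-latency analysis implied by the abstract, written in plain NumPy; it is not the authors' pipeline, and the array shapes, sampling step, channel index, and P1/N1 window bounds are assumptions chosen for the example.

import numpy as np

def grand_average(subject_erps):
    """Average a list of (n_channels, n_times) subject ERPs into one grand average."""
    return np.mean(np.stack(subject_erps, axis=0), axis=0)

def peak_in_window(erp, times, tmin, tmax, channel, mode="max"):
    """Return (latency_s, amplitude) of the extremum within [tmin, tmax] for one channel."""
    mask = (times >= tmin) & (times <= tmax)
    segment = erp[channel, mask]
    idx = np.argmax(segment) if mode == "max" else np.argmin(segment)
    return times[mask][idx], segment[idx]

# Example: 15 simulated participants, 128 channels, epochs from -100 to 600 ms at 2 ms steps.
rng = np.random.default_rng(0)
times = np.arange(-0.1, 0.6, 0.002)
erps = [rng.normal(0.0, 1e-6, (128, times.size)) for _ in range(15)]
ga = grand_average(erps)

# Assumed windows: P1 as a positive peak ~60-120 ms, N1 as a negative peak ~120-200 ms.
p1_lat, p1_amp = peak_in_window(ga, times, 0.06, 0.12, channel=70, mode="max")
n1_lat, n1_amp = peak_in_window(ga, times, 0.12, 0.20, channel=70, mode="min")
print(f"P1: {p1_lat*1000:.0f} ms, {p1_amp:.2e} V; N1: {n1_lat*1000:.0f} ms, {n1_amp:.2e} V")

In practice such component measures would be contrasted between unimodal and cross-modal conditions; the fMRI-constrained source analysis reported in the abstract goes beyond this sensor-level sketch.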