Investigating the Interplay of Time & Semantics during Multimodal Integration

Jean M Vettel, Adrian Nestor, Chris W. Bird, Laurie M. Heller, Tim Curran, Michael J. Tarr
Poster
Time: 2009-07-01  09:00 AM – 10:30 AM
Last modified: 2009-06-04

Abstract


Real-world events often give rise to information across multiple perceptual modalities. Tapping a pencil, for example, provides correlated auditory and visual information to the senses. How are such events neurally represented? Prior behavioral and neural research on multimodal integration has identified spatial, temporal, and semantic congruency between modalities as critical factors guiding integration (Meredith & Stein, 1993; Doehrmann & Naumer, 2008). That is, multimodal information arising from a common physical cause shares a common temporal structure and spatial location and carries semantic associations based on context and prior experience. We investigated how two of these factors, temporal and semantic congruence, interact during multimodal integration in two separate experiments, one employing event-related fMRI and the other EEG. In both experiments, participants viewed and/or heard 2-second real-world environmental events containing discrete impacts, such as splashing water or snapping twigs. These events were then edited to create four multimodal conditions in a 2x2 factorial design crossing semantic and temporal congruence:

Condition   Semantically Congruent?   Temporally Congruent?
1           Yes                       Yes (original movie)
2           Yes                       No
3           No                        Yes
4           No                        No

The fMRI study identified several brain regions showing a main effect of semantic congruence (i.e., Conditions 1 & 2 vs. Conditions 3 & 4) in the frontal cortex, including the left inferior frontal gyrus. The main effect of temporal congruence (Conditions 1 & 3 vs. Conditions 2 & 4) revealed a somewhat larger network of regions in the middle frontal, posterior temporal, and inferior occipital cortices. In addition, the EEG data provide more fine-grained information about the temporal dynamics of these effects, identifying when temporal and semantic congruency influence processing. The EEG data will further be used to guide a functional connectivity analysis of the fMRI data, providing a more informed account of how different brain regions interact during the processing of multimodal stimuli.
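The main-effect contrasts above follow directly from the 2x2 design: each effect averages over the other factor. A minimal sketch of how such contrasts are computed from per-condition response estimates (the beta values here are purely illustrative, not data from the study):

```python
import numpy as np

# Hypothetical mean responses (e.g., BOLD beta estimates) for one region,
# ordered by condition: 1 = SC+TC, 2 = SC only, 3 = TC only, 4 = neither.
betas = np.array([2.0, 1.5, 0.8, 0.5])  # illustrative values only

# Main effect of semantic congruence: Conditions 1 & 2 vs. 3 & 4
semantic_effect = betas[[0, 1]].mean() - betas[[2, 3]].mean()

# Main effect of temporal congruence: Conditions 1 & 3 vs. 2 & 4
temporal_effect = betas[[0, 2]].mean() - betas[[1, 3]].mean()

# Interaction: does temporal congruence matter more when semantics match?
interaction = (betas[0] - betas[1]) - (betas[2] - betas[3])

print(semantic_effect, temporal_effect, interaction)
```

In practice these contrasts would be evaluated voxel-wise against an error term rather than on raw means, but the averaging structure is the same.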
