Multisensory interactions facilitate categorical discrimination of objects

Celine Cappe, Micah M. Murray
Poster
Time: 2009-07-02  09:00 AM – 10:30 AM
Last modified: 2009-06-04

Abstract


The present study investigated the extent to which the discrimination of everyday objects is affected under multisensory conditions. Recent evidence suggests that visual articulatory information can speed up auditory speech processing (van Wassenhove et al., 2005 PNAS), though it should be noted that in that study, and in speech more generally, the visual component often precedes its auditory counterpart. Research investigating the integration of synchronously presented auditory-visual object stimuli has instead focused on effects of attention and/or has limited the stimulus set to animals (Molholm et al., 2004 Cereb Cortex; Yuval-Greenberg and Deouell, 2007 J Neurosci). It thus remains unknown whether categorical discrimination of environmental objects, and by extension object recognition, benefits from multisensory stimulation at either a behavioral or a neurophysiological level. We focused here on the categories of living and man-made objects, given previous research demonstrating that these engage (partially) dissociable brain networks (e.g., Murray et al., 2006 J Neurosci for sounds; Gerlach, 2007 J Cogn Neurosci for images). Participants were presented with auditory (A), visual (V), or simultaneous auditory-visual (AV) stimuli during a living vs. man-made discrimination task. The auditory stimuli were those used in our prior work and were controlled both spectro-temporally and psychophysically. The visual stimuli were derived from a controlled image set (Snodgrass and Vanderwart, 1980 J Exp Psychol). Reaction times (RTs) were submitted to a 2 × 3 repeated-measures ANOVA with within-subject factors of category (living vs. man-made) and sensory modality (A, V, AV). While RTs were generally slower for auditory than for either visual or multisensory conditions, there was no evidence that RTs differed between the visual and multisensory conditions, and RTs likewise did not differ between categories. Thus, there was no support for multisensory facilitation of behavior. By contrast, our ongoing electrical neuroimaging analyses revealed facilitated discrimination of object categories when subjects were presented with the multisensory versus either unisensory condition. Both the auditory and visual conditions exhibited topographic differences between living and man-made object categories at ~140 ms, indicative of changes in the configuration of the intracranial sources active in response to these object categories when presented visually or acoustically. Following multisensory stimulation, however, this differential effect occurred ~20 ms earlier. While not necessarily facilitating the earliest stages of categorical discrimination, these results nonetheless suggest that object recognition processes in vision and audition interact and can facilitate one another under multisensory conditions.
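For readers who want to run this kind of behavioral analysis themselves, the sketch below shows how a 2 × 3 repeated-measures ANOVA on RTs could be computed. This is an illustrative assumption, not the authors' analysis code: the file name, the column names, and the use of the pingouin library are all hypothetical.

```python
# Illustrative sketch only -- not the authors' analysis code.
# Assumes per-subject mean RTs in long format with hypothetical
# columns: subject, category ("living"/"man-made"),
# modality ("A"/"V"/"AV"), and rt (in ms).
import pandas as pd
import pingouin as pg

rts = pd.read_csv("mean_rts.csv")  # hypothetical file name

# 2 (category) x 3 (modality) repeated-measures ANOVA on RT.
aov = pg.rm_anova(
    data=rts,
    dv="rt",
    within=["category", "modality"],
    subject="subject",
    detailed=True,
)
print(aov)

# Pairwise follow-ups across modalities (A vs. V, A vs. AV,
# V vs. AV), Bonferroni-corrected.
posthoc = pg.pairwise_tests(
    data=rts,
    dv="rt",
    within="modality",
    subject="subject",
    padjust="bonf",
)
print(posthoc)
```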
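The topographic differences described above are the kind of effect conventionally quantified with global map dissimilarity (GMD): the root-mean-square difference between two strength-normalized, average-referenced scalp maps. The abstract does not specify the authors' exact implementation, so the following is only a minimal sketch of the standard GMD computation, with hypothetical data standing in for real ERP topographies.

```python
import numpy as np

def gfp(scalp_map):
    """Global field power: spatial standard deviation of an
    average-referenced map across electrodes."""
    centered = scalp_map - scalp_map.mean()
    return np.sqrt(np.mean(centered ** 2))

def gmd(map_a, map_b):
    """Global map dissimilarity between two scalp maps: the RMS
    difference of the average-referenced, GFP-normalized maps.
    0 = identical topographies; 2 = polarity-inverted maps."""
    a = (map_a - map_a.mean()) / gfp(map_a)
    b = (map_b - map_b.mean()) / gfp(map_b)
    return np.sqrt(np.mean((a - b) ** 2))

# Hypothetical usage: compare living vs. man-made ERP topographies
# (one value per electrode) at a single latency, e.g. ~140 ms.
rng = np.random.default_rng(0)
living, man_made = rng.normal(size=64), rng.normal(size=64)
print(gmd(living, man_made))
```

Because GMD is computed on GFP-normalized maps, a nonzero value reflects a change in the spatial configuration of the scalp field, and hence in the configuration of the underlying intracranial sources, rather than a mere difference in response strength.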
