Multisensory contributions to visual motion parsing
Salvador Soto-Faraco

Last modified: 2011-08-22

Abstract

In humans, as in most animal species, the perception of object motion is critical for successful interaction with the surrounding environment. Yet as the observer moves, the retinal projections of the various motion components sum, and extracting accurate object motion becomes computationally challenging. Recent psychophysical studies have demonstrated that observers use a flow parsing mechanism to estimate self-motion from the optic flow field and subtract it. We investigated whether concurrent acoustic motion cues can facilitate visual flow parsing, thereby enhancing the detection of moving objects during simulated self-motion. Participants identified an object (the target) that moved either forward or backward within a visual scene that contained nine identical textured objects and simulated forward observer translation. We found that spatially co-localized, directionally congruent moving auditory stimuli enhanced object motion detection. Interestingly, participants who performed poorly on the visual-only task benefited more from the addition of moving auditory stimuli. When the auditory stimuli were not co-localized with the visual target, improvements in detection rates were weak. Taken together, these results suggest that the parsing of object motion from self-motion-induced optic flow can operate on multisensory object representations.
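
As a rough sketch of the flow parsing idea described above (the notation here is illustrative and not taken from the study): the retinal velocity field can be written as the sum of the flow induced by self-motion and the flow due to independent object motion,

\[ \mathbf{v}_{\mathrm{retina}}(\mathbf{x}) \,=\, \mathbf{v}_{\mathrm{self}}(\mathbf{x}) + \mathbf{v}_{\mathrm{object}}(\mathbf{x}), \]

and flow parsing recovers scene-relative object motion by estimating the self-motion component from the global flow field and subtracting it,

\[ \hat{\mathbf{v}}_{\mathrm{object}}(\mathbf{x}) \,=\, \mathbf{v}_{\mathrm{retina}}(\mathbf{x}) - \hat{\mathbf{v}}_{\mathrm{self}}(\mathbf{x}). \]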
