There can be only one! Integrating vision and touch at different egocentric locations
Single Paper Presentation
Hannah Helbig
MPI for Biological Cybernetics
Marc Ernst
MPI for Biological Cybernetics
Abstract ID Number: 59
Full text: Not available
Last modified: March 15, 2006
Presentation date: 06/18/2006 4:00 PM in Hamilton Building, Foyer
Abstract
Ernst and Banks (2002) showed that humans integrate visual and haptic signals in a statistically optimal fashion. Integration seems to break down if there is a spatial discrepancy between the signals (Gepshtein et al., 2005).
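For context, the reliability-weighted (maximum-likelihood) combination rule tested by Ernst and Banks (2002) predicts a combined estimate of the form

\hat{s} = w_V s_V + w_H s_H, \qquad w_V = \frac{\sigma_H^2}{\sigma_V^2 + \sigma_H^2}, \qquad w_H = 1 - w_V,

where s_V and s_H denote the visually and haptically specified shape values and \sigma_V^2, \sigma_H^2 the corresponding noise variances. The symbols here are illustrative and are not taken from the abstract itself.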
Does knowledge that two signals belong to the same object facilitate integration even when they are presented at discrepant locations?
In our experiment, participants had to judge the shape of visual-haptic objects. In one condition, visual and haptic object information was presented at the same location; in the other condition there was a spatial offset between the two information sources, but subjects knew that the signals belonged together. In both conditions, we introduced a slight conflict between the visually and haptically perceived shape and asked participants to report the felt (seen) shape. If integration breaks down because of the spatial discrepancy, we expect subjects' percept to be less biased by the visual (haptic) information.
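Under this model, the bias can be quantified by an empirical visual weight. A common estimate in such conflict paradigms (the specific analysis used here is not stated in the abstract) is

w_V = \frac{\hat{s} - s_H}{s_V - s_H},

so a value of w_V near zero in the haptic judgement would indicate that integration has broken down, whereas an intermediate value indicates that the conflicting visual signal still pulls the percept.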
We found that in both conditions the shape percept fell in between the haptically and visually specified shapes, and the percepts did not differ significantly between conditions. This finding suggests that multimodal signals are combined if observers have reason to assume that they belong to the same event, even when there is a spatial discrepancy between them.