Are common consequences sufficient for visual-haptic integration?

Sascha Serwe, Konrad P Koerding, Julia Trommershäuser
Poster
Last modified: 2008-05-13

Abstract


Integration of information across sensory modalities helps to increase the accuracy of perceptual judgements. However, if two signals do not share a common cause, they should be recognized as independent signals and processed separately. The causal inference model (Körding & Tenenbaum, 2006) suggests a continuous transition from integration to separate processing, governed by the likelihood of a common cause. Here we asked whether subjects are able to integrate visual and haptic signals that do not share a common cause but do share a common consequence. We present an experiment in which subjects performed goal-directed pointing movements. The position of the goal had to be inferred from both visual and haptic information presented during movement execution. We varied the temporal distance between the signals to either facilitate or hinder the integration process. If the information contained in both signals is integrated, we expect an improvement in pointing accuracy. In contrast to perceptual cue integration, the signals do not intuitively belong together but have to be integrated actively. Only 2 out of 6 subjects showed the expected increase in pointing accuracy under simultaneous presentation of visual and haptic information. Performance under successive presentation was not worse than under simultaneous presentation.
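The predicted accuracy benefit rests on the standard maximum-likelihood cue-combination result: an optimal observer weights each cue by its reliability (inverse variance), and the variance of the combined estimate is lower than that of either cue alone. The following sketch illustrates that prediction; it is not the authors' analysis, and the variance values are hypothetical numbers chosen only for illustration.

```python
# Illustrative sketch of reliability-weighted cue combination,
# the standard model behind the expected accuracy improvement.

def combine_cues(m_v, var_v, m_h, var_h):
    """Fuse a visual estimate (m_v, var_v) with a haptic estimate
    (m_h, var_h) by weighting each with its inverse variance."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_h)  # visual weight
    w_h = 1 - w_v                                # haptic weight
    m_c = w_v * m_v + w_h * m_h                  # combined estimate
    var_c = (var_v * var_h) / (var_v + var_h)    # combined variance
    return m_c, var_c

# Hypothetical single-cue variances: visual = 4, haptic = 9.
m_c, var_c = combine_cues(0.0, 4.0, 1.0, 9.0)
# var_c = 36/13, which is below both single-cue variances: integrating
# the two signals should therefore improve pointing accuracy.
```

If the signals are processed separately instead, the best the observer can do is rely on the more reliable single cue, so pointing variance should stay at the single-cue level rather than drop to the combined value.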
