Task-irrelevant spatial sounds affect haptic scene recognition

Jason S. Chan, Trinity College Dublin

Abstract
Previous research has found that performance in both haptic matching tasks (Newport et al., 2002) and haptic scene perception (Pasqualotto et al., 2007) improved when non-informative visual information was available, suggesting that vision provides the reference frame for spatial information in other modalities. Here we explored whether non-informative spatial sounds can affect haptic scene perception. Participants were blindfolded to minimize any external visual cues; their task was first to learn the spatial arrangement of an array of objects through touch and then to indicate which two objects in the array had switched positions. When broadband noise was presented from the four corners of the room consecutively (Exp. 1) or from a single loudspeaker placed directly in front of the participant (Exp. 2), these sounds did not affect performance relative to the ‘no sound’ condition. However, when we presented spatially distinct pure tones from each corner of the room in succession (Exp. 3), we found an interference effect. Our findings suggest that although sound lacks the spatial precision of vision required to facilitate cross-modal spatial perception, it can nevertheless be sufficient to disrupt spatial processing in touch.
