Behavioural investigations of audiotactile interactions in humans

Valeria Occelli
Poster
Time: 2009-06-30  09:00 AM – 10:30 AM
Last modified: 2009-06-04

Abstract


Despite the large number of functional similarities (e.g., Gescheider, 1970) as well as commonly shared neural substrates (e.g., Kayser, Petkov, Augath, & Logothetis, 2005), the interactions occurring between touch and hearing have so far been far less well explored than is the case for other sensory pairings. We present a series of studies aimed at investigating different aspects of the interactions between audition and touch.
A speeded discrimination task was used to explore the possible occurrence of an audiotactile Colavita effect (Colavita, 1974). Simultaneous auditory and tactile signals showed weaker competition than that observed in audiovisual and visuotactile pairings, thus providing evidence of an audiotactile sensory balance (Occelli, Hartcher O'Brien, Spence, & Zampini, 2009).
In another series of studies, we explored sensory correspondences, such as those occurring between the frequency of a sound and the relative elevation of a tactile stimulus. Using a modified version of the Implicit Association Test (Greenwald, McGhee, & Schwartz, 1998), better performance was observed in the compatible (vs. incompatible) blocks, thus providing empirical support for the crossmodal association between the frequency of a sound and the relative elevation of a tactile stimulus (Occelli, Zampini, & Spence, 2009a).
Sensory correspondences based on frequency have also been explored by testing whether people are able to match the frequency of stimuli presented within the same sensory modality (i.e., audition or touch) or crossmodally (i.e., one stimulus presented to the skin and the other to the ear). Both the flutter frequency range (i.e., frequencies <50 Hz) and the vibration frequency range (i.e., frequencies >50 Hz) were tested. The results showed that matching was significantly more accurate when both stimuli were presented in audition than in the other two conditions, and that accuracy in discriminating tactile frequency differences was higher within the flutter (vs. vibration) range. Despite the difficulty of matching auditory and tactile frequencies, performance was modulated by the magnitude of the discrepancy between the frequencies of the standard and comparison stimuli for stimuli presented in the flutter range. Thus, people can match frequencies presented crossmodally, provided that the frequencies fall within the flutter (vs. vibration) range.
In the audiotactile motion domain, we have shown that changes in the physical properties of a sound (i.e., its intensity and its type) can differentially affect the audiotactile capture effect in a 'crossmodal dynamic capture task'. The capture effect of the 82 dB auditory distractors was significantly more pronounced than that induced by the 75 dB auditory distractors, and performance in reporting the direction of auditory motion was overall significantly better when the stimuli were presented at 82 dB vs. 75 dB (Occelli, Zampini, & Spence, 2009b). Moreover, while the crossmodal dynamic capture exerted by tactile distractors was comparable for auditory stimuli consisting of either white noise or pure tones, the capture effect induced by white noise distractors was significantly larger than that induced by pure tones (Occelli, Spence, & Zampini, 2009c).
Another aspect of particular interest is how visual experience can modulate these audiotactile interactions. In one study, we compared temporal perception in blind and sighted participants using a temporal order judgment (TOJ) task, with auditory and tactile stimuli presented from the same vs. different positions. Our results (Occelli, Spence, & Zampini, 2008) showed that the redundant spatial information provided by non-visual stimuli in frontal space exerted a selective influence on the performance of the blind participants, whereas the spatial arrangement of the stimuli did not modulate the performance of the sighted controls.
In summary, the results emerging from these studies highlight the multiplicity of ways in which auditory and tactile stimuli can interact, possibly suggesting further aspects to be investigated in future research.