Dissociating pitch and loudness interactions between audition and touch
Last modified: 2013-05-05
Abstract
In this talk I review a series of studies characterizing the relationship between audition and touch. We perceive the frequency and intensity of environmental oscillations (sounds and vibrations) using both modalities. Audio-tactile interactions in the frequency domain are frequency-specific and bidirectional: the interaction patterns support the existence of shared (supramodal) frequency representations. In contrast, audio-tactile interactions in the intensity domain follow a separate set of integration rules. Thus, a given pair of auditory and tactile inputs combines differently depending on the perceptual task (i.e., pitch vs. loudness discrimination). The fact that distinct rules govern the integration of auditory and tactile signals in pitch and loudness perception implies that the two processes rely on separate neural mechanisms. Other perceptual processes that combine auditory and tactile signals, such as stimulus detection or spatial localization, may likewise adhere to unique integration rules reflecting dissociable neural mechanisms. These results underscore the complexity and specificity of multisensory interactions.
Keywords
psychophysics; integration; attention