Perceptual learning suggests crossmodal plasticity in adult humans at relatively early levels of processing
Anton L. Beer, Melissa A. Batson, Takeo Watanabe
Talk
Last modified: 2008-05-13
Abstract
Sounds modulate the perception of spatially aligned visual patterns in normal humans and elicit neural activity in the visual cortex of congenitally blind humans. However, the mechanisms underlying these crossmodal interactions are still unclear. We developed a novel perceptual learning paradigm that allowed us to detect crossmodal plasticity after repeated exposure to audio-visual stimuli. We found that peripheral sounds paired with spatially aligned moving dots induced persistent changes in visual motion discrimination. Auditory-guided visual plasticity was restricted to visual field locations that overlapped with the sound source and to the motion direction that was paired with the sound. This location- and feature-specificity suggests that crossmodal plasticity occurred at relatively early levels of visual perception. We further tested whether the audio-visual maps underlying crossmodal interactions can be re-aligned in adult humans. Repeated exposure to mis-aligned audio-visual stimuli resulted in a shift of crossmodal maps. Crossmodal facilitation effects showed a relatively sharp spatial gradient, whereas crossmodal inhibition effects had a relatively poor spatial resolution, suggesting that facilitation involves earlier processing stages than inhibition. Interestingly, facilitation effects were even more susceptible to perceptual learning than inhibition effects, suggesting that multisensory plasticity in adults affects relatively early levels of multisensory integration.
