Audio-visual integration for objects, location and low-level dynamic stimuli: novel insights from studying sensory substitution and topographical mapping.

Amir Amedi, William Stern, Lotfi Merabet, Ella Striem, Uri Hertz, Peter Meijer, Alvaro Pascual-Leone
Symposium Talk
Last modified: 2008-05-13

Abstract

This talk will present fMRI and behavioral experiments on auditory-visual integration in humans. It will focus on integration in sighted individuals as well as in sight-restoration settings, examining the effects of learning, brain development, and brain plasticity. New findings on the nature of sensory representations for dynamic stimuli, ranging from pure tones to complex natural object sounds, will be presented. I will highlight the use of sensory substitution devices (SSDs) in the context of blindness. In an SSD, visual information captured by an artificial receptor is delivered to the brain through a non-visual sensory modality. Using a visual-to-auditory SSD called "The vOICe", we find that blind users achieve successful performance on object recognition tasks, accompanied by specific recruitment of ventral and dorsal 'visual' structures. Comparable recruitment was observed in sighted participants learning to use this device, but not in sighted participants learning arbitrary associations between sounds and object identity. Using phase-locked Fourier techniques, we also find an array of topographic maps that can serve as a basis for such audio-visual integration. Finally, these results suggest that "The vOICe" can be useful for blind individuals' daily activities, and that it may also help 'guide' the visual cortex to interpret visual information arriving from a prosthesis.