Early auditory sensory processing is facilitated by visual mechanisms
Sonja Schall, Stefan J. Kiebel, Burkhard Maess, Katharina von Kriegstein

Date: 2012-06-21 01:30 PM – 03:00 PM
Last modified: 2012-04-30

Abstract

There is compelling evidence that low-level sensory areas are sensitive to more than one modality. For example, auditory cortices respond to visual-only stimuli [1-3] and, conversely, visual sensory areas respond to sound sources even in auditory-only conditions [4-6]. Currently, it is unknown what makes the brain activate modality-specific sensory areas solely in response to input of a different modality. One reason may be that such activations are instrumental for early sensory processing of the input modality, a hypothesis that runs contrary to current textbook knowledge. Here we test this hypothesis using magnetoencephalography (MEG), a method with high temporal resolution, to identify the temporal response profile of visual regions during auditory-only voice recognition. Participants (n = 19) briefly learned a set of voices audio-visually, i.e. together with the corresponding talking face, in an ecologically valid situation resembling daily life. Once participants were able to recognize these now-familiar voices, we measured their brain responses with MEG. The results revealed two key mechanisms that characterize the sensory processing of familiar speakers' voices: (i) activation of the visual, face-sensitive fusiform gyrus at very early auditory processing stages, i.e. only 100 ms after auditory onset, and (ii) a temporal facilitation of auditory processing (M200) that was directly associated with improved recognition performance. These findings suggest that visual areas are instrumental even during very early auditory-only processing stages and indicate that the brain uses visual mechanisms to optimize sensory processing and recognition of auditory stimuli.
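To make the latency measures concrete, the following minimal sketch shows how an evoked-field peak latency such as the M200 could be extracted from MEG epochs using MNE-Python. This is an illustration, not the authors' actual pipeline; the file name and condition labels are hypothetical. A shortened peak latency for familiar relative to unfamiliar voices would correspond to the temporal facilitation reported above.

    # Illustrative sketch only: estimating M200 peak latency from MEG epochs.
    # The file name and condition labels are hypothetical assumptions.
    import mne

    epochs = mne.read_epochs("sub01_voices-epo.fif")  # epoched MEG data, one subject

    for condition in ("familiar_voice", "unfamiliar_voice"):
        evoked = epochs[condition].average()  # averaged evoked field per condition
        # Peak of the magnetometer response in an M200 window (~150-250 ms
        # after auditory onset); comparing these latencies across conditions
        # would index a temporal facilitation of auditory processing.
        sensor, latency = evoked.get_peak(ch_type="mag", tmin=0.15, tmax=0.25)
        print(f"{condition}: M200 peak at {latency * 1e3:.0f} ms (sensor {sensor})")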

References


1. Calvert GA, Bullmore ET, Brammer MJ, Campbell R, Williams SC, et al. (1997) Activation of auditory cortex during silent lipreading. Science 276: 593-596.
2. Pekkola J, Ojanen V, Autti T, Jääskeläinen IP, Möttönen R, et al. (2005) Primary auditory cortex activation by visual speech: an fMRI study at 3 T. Neuroreport 16: 125-128.
3. Meyer K, Kaplan JT, Essex R, Webber C, Damasio H, et al. (2010) Predicting visual stimuli on the basis of activity in auditory cortices. Nat Neurosci 13: 667-668.
4. von Kriegstein K, Giraud AL (2006) Implicit multisensory associations influence voice recognition. PLoS Biol 4: e326.
5. von Kriegstein K, Dogan O, Grüter M, Giraud AL, Kell CA, et al. (2008) Simulation of talking faces in the human brain improves auditory speech recognition. Proc Natl Acad Sci U S A 105: 6747-6752.
6. Poirier C, Collignon O, Devolder AG, Renier L, Vanlierde A, et al. (2005) Specific activation of the V5 brain area by auditory motion processing: an fMRI study. Brain Res Cogn Brain Res 25: 650-658.
