How the blind “see” Braille and the deaf “hear” sign: Lessons from fMRI on cross-modal plasticity, integration, and learning
Norihiro Sadato

Last modified: 2011-10-07

Abstract


What does the visual cortex of the blind do during Braille reading? Braille reading requires converting simple tactile information into meaningful patterns that have lexical and semantic properties. The perceptual processing of Braille might be mediated by the somatosensory system, whereas in sighted people visual letter identity is accomplished within the visual system. Recent advances in functional neuroimaging have made it possible to explore the neural substrates of Braille reading (Sadato et al. 1996, 1998, 2002, Cohen et al. 1997, 1999). The primary visual cortex of early-onset blind subjects is functionally relevant to Braille reading, indicating a remarkable plasticity of the brain that permits additional processing of tactile information in visual cortical areas. Similar cross-modal plasticity is observed following auditory deprivation: sign language activates the auditory cortex of deaf subjects (Neville et al. 1998, Nishimura et al. 1999, Sadato et al. 2004).
Cross-modal activation can also be seen in sighted and hearing subjects. For example, tactile discrimination of two-dimensional (2D) shapes (Mah-Jong tiles) activated the visual cortex of expert players (Saito et al. 2006), and lip-reading (visual phonetics) (Sadato et al. 2004) and key-touch reading by pianists (Hasegawa et al. 2004) activate the auditory cortex of hearing subjects. Thus, cross-modal plasticity induced by sensory deprivation and cross-modal integration acquired through learning may share neural substrates.
To clarify the distribution of these neural substrates and their dynamics during cross-modal association learning over the course of several hours, we used an audio-visual paired-association learning task of the delayed-matching-to-sample type (Tanabe et al. 2005). Each trial consisted of the successive presentation of a pair of stimuli separated by a delay. Subjects had to discover pre-defined audio-visual or visuo-visual pairs by trial and error, with feedback given in each trial. During the delay period, the MRI signal in unimodal and polymodal areas increased as cross-modal association learning proceeded, suggesting that cross-modal associations are formed by binding unimodal sensory areas via polymodal regions.
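As a rough illustration of this trial structure (a minimal sketch, not taken from Tanabe et al. 2005; the stimulus names, pair mapping, and learning rule below are hypothetical), the following Python simulation presents a sample stimulus, then a test stimulus after a delay, asks a simulated subject to judge whether the two form a pre-defined pair, and uses end-of-trial feedback to drive trial-and-error learning.

```python
import random

# Hypothetical pre-defined audio-visual pairings (not the actual stimuli of
# Tanabe et al. 2005); the learner must discover them by trial and error.
TRUE_PAIRS = {"tone_A": "shape_1", "tone_B": "shape_2", "tone_C": "shape_3"}
SOUNDS = list(TRUE_PAIRS.keys())
SHAPES = list(TRUE_PAIRS.values())

def run_session(n_trials=200, seed=0):
    rng = random.Random(seed)
    learned = {}            # the subject's current guess: sound -> shape
    n_correct = 0

    for _ in range(n_trials):
        sample = rng.choice(SOUNDS)   # first stimulus (e.g., auditory sample)
        # ... delay period would occur here in the scanner ...
        test = rng.choice(SHAPES)     # second stimulus (e.g., visual test)

        # Match / non-match judgement based on what has been learned so far;
        # before the pairing is known, the response is a guess.
        if sample in learned:
            response_match = (learned[sample] == test)
        else:
            response_match = rng.random() < 0.5

        actual_match = (TRUE_PAIRS[sample] == test)
        if response_match == actual_match:
            n_correct += 1

        # Feedback at the end of each trial: a matching trial reveals the
        # correct pairing, which is stored for subsequent trials.
        if actual_match:
            learned[sample] = test

    return n_correct / n_trials

if __name__ == "__main__":
    print(f"Proportion correct over the session: {run_session():.2f}")
```

In the actual experiment the learner is of course the human subject, and the quantity of interest is the fMRI signal during the delay period as learning proceeds, rather than the behavioral accuracy computed here.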
These studies show that sensory deprivation, as well as long- and short-term learning, dynamically modifies the brain organization underlying multisensory integration.

References


Cohen, L. G., Celnik, P., Pascual-Leone, A., Corwell, B., Falz, L., Dambrosia, J., Honda, M., Sadato, N., Gerloff, C., Catala, M. D. & Hallett, M. (1997) Functional relevance of cross-modal plasticity in blind humans. Nature 389(6647): 180-183.
Cohen, L. G., Weeks, R. A., Sadato, N., Celnik, P., Ishii, K. & Hallett, M. (1999) Period of susceptibility for cross-modal plasticity in the blind. Ann Neurol 45(4): 451-460.
Hasegawa, T., Matsuki, K., Ueno, T., Maeda, Y., Matsue, Y., Konishi, Y. & Sadato, N. (2004) Learned audio-visual cross-modal associations in observed piano playing activate the left planum temporale. An fMRI study. Brain Res Cogn Brain Res 20(3): 510-518.
Neville, H. J., Bavelier, D., Corina, D., Rauschecker, J., Karni, A., Lalwani, A., Braun, A., Clark, V., Jezzard, P. & Turner, R. (1998) Cerebral organization for language in deaf and hearing subjects: biological constraints and effects of experience. Proc Natl Acad Sci U S A 95(3): 922-929.
Nishimura, H., Hashikawa, K., Doi, K., Iwaki, T., Watanabe, Y., Kusuoka, H., Nishimura, T. & Kubo, T. (1999) Sign language 'heard' in the auditory cortex. Nature 397: 116.
Sadato, N., Pascual-Leone, A., Grafman, J., Ibanez, V., Deiber, M. P., Dold, G. & Hallett, M. (1996) Activation of the primary visual cortex by Braille reading in blind subjects. Nature 380(6574): 526-528.
Sadato, N., Pascual-Leone, A., Grafman, J., Deiber, M. P., Ibanez, V. & Hallett, M. (1998) Neural networks for Braille reading by the blind. Brain 121(Pt 7): 1213-1229.
Sadato, N., Okada, T., Honda, M., Matsuki, K. I., Yoshida, M., Kashikura, K. I., Takei, W., Sato, T., Kochiyama, T. & Yonekura, Y. (2004) Cross-modal integration and plastic changes revealed by lip movement, random-dot motion and sign languages in the hearing and deaf. Cereb Cortex.
Saito, D. N., Okada, T., Honda, M., Yonekura, Y. & Sadato, N. (2006) Practice makes perfect: the neural substrates of tactile discrimination by Mah-Jong experts include the primary visual cortex. BMC Neurosci 7: 79.
Tanabe, H. C., Honda, M. & Sadato, N. (2005) Functionally segregated neural substrates for arbitrary audiovisual paired-association learning. J Neurosci 25(27): 6409-6418.
