Integration of Cued Speech with residual hearing
Jintao Jiang
Poster
Time: 2009-07-01 09:00 AM – 10:30 AM
Last modified: 2009-06-04
Abstract
Cued Speech is a manual system that was designed to disambiguate visible speech information and thus afford full speech information to a person with a hearing loss. Cued Speech has been shown to contribute significantly to the acquisition of language and reading skills in deaf children through early and extensive exposure. In the present study, the potential use of Cued Speech as a multisensory tool for auditory rehabilitation of adult cochlear implant recipients was examined through a simulation of residual hearing using vocoded speech. A specific question was whether Cued Speech can be integrated with adult cochlear implant recipients’ residual hearing and lipreading. Toward this end, a computer-based Cued Speech English synthesis system was developed that used concatenative methods to synthesize visible hand cues. The synthesized visible hand cues were overlaid onto naturally recorded visible speech. Speech materials consisted of 260 consonant-vowel-consonant-vowel-consonant (CVCVC) nonsense words. This set of words was generated with a Monte Carlo method so that the words had characteristics similar to those of real English words. These words were realized as audiovisual speech and as audiovisual speech plus synthesized visible hand cues. Twelve adult participants with normal hearing were trained on Cued Speech using 120 words presented with visible speech plus synthesized/natural visible hand cues. After about 10 to 15 hours of training, a first test examined whether these participants could perceptually process both visible speech and visible hand cues. A set of 120 nonsense words, different from those used in training, was presented in an open-set identification task, in which participants entered what was said using a computer keyboard. Within the test words, there were 30 words with visible speech only and 30 words with visible speech plus synthesized visible hand cues.
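The Monte Carlo generation of CVCVC nonsense words might be sketched as rejection sampling: draw random CVCVC candidates and keep only those that are not real words. The phoneme inventories, the selection weights, and the English-likeness criteria below are illustrative assumptions; the abstract does not specify them.

```python
import random

# Illustrative phoneme inventories (assumed; the study's actual
# phoneme set and sampling weights are not given in the abstract).
CONSONANTS = ["p", "t", "k", "b", "d", "g", "m", "n", "s", "l"]
VOWELS = ["a", "i", "u", "e", "o"]

def sample_cvcvc_nonwords(rng, n_words, lexicon):
    """Monte Carlo sampling of CVCVC nonsense words.

    Draw random CVCVC candidates and reject any that appear in
    `lexicon` (a set of real words), until n_words unique
    nonwords have been collected.
    """
    words = set()
    while len(words) < n_words:
        candidate = "".join([rng.choice(CONSONANTS), rng.choice(VOWELS),
                             rng.choice(CONSONANTS), rng.choice(VOWELS),
                             rng.choice(CONSONANTS)])
        if candidate not in lexicon:
            words.add(candidate)
    return sorted(words)
```

In the study itself the acceptance criterion was presumably stricter (matching phonotactic statistics of English), but the reject-and-resample loop is the core of the Monte Carlo approach.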
Results show that phoneme-correct scores with visible speech plus synthesized visible hand cues were higher than those with visible speech alone. This indicates that participants can learn Cued Speech and integrate visible hand cues with their lipreading. To simulate the use of Cued Speech by adults with cochlear implants, these participants with normal hearing also performed an open-set identification task with vocoded speech. Two-channel vocoding was used to simulate a situation in which adults with cochlear implants receive very limited auditory information from their implants. Specifically, the speech signals were band-pass filtered into two frequency bands, the temporal envelope extracted from each band was used to modulate a sinewave carrier, and the modulated bands were then summed. The same set of 120 test words was used after randomization. Within the test words, there were 30 words with vocoded speech only, 30 words with vocoded speech plus visible speech, and 30 words with vocoded speech plus visible speech plus synthesized visible hand cues. Results show that synthesized visible hand cues improved vocoded audiovisual speech perception. A final test demonstrated that this improvement did not derive solely from visible speech plus synthesized visible hand cues or from vocoded speech plus synthesized visible hand cues, but from the combination of vocoded speech, visible speech, and synthesized visible hand cues. Furthermore, correlation analyses of confusion matrices indicate that the perception of vocoded speech only, visible speech only, and visible hand cues only accounted for 73%, 63%, and 51% of the variance in the perception of vocoded speech plus visible speech plus visible hand cues, respectively.
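The two-channel sinewave vocoding described above (band-pass filter, envelope extraction, carrier modulation, summation) can be sketched as follows. The band edges, filter orders, envelope smoothing cutoff, and carrier placement at each band's geometric center are illustrative assumptions; the abstract does not specify these parameters.

```python
import numpy as np
from scipy.signal import butter, sosfilt, hilbert

def two_channel_vocode(speech, fs, edges=(100.0, 1500.0, 5000.0)):
    """Two-channel sinewave vocoder sketch.

    For each frequency band: band-pass filter the speech, extract
    the temporal envelope, and use it to modulate a sinewave
    carrier; then sum the modulated bands. Band edges (`edges`),
    filter orders, and the 50 Hz envelope cutoff are assumed values.
    """
    out = np.zeros(len(speech), dtype=float)
    t = np.arange(len(speech)) / fs
    for lo, hi in zip(edges[:-1], edges[1:]):
        # Band-pass filter the speech into one frequency band.
        sos_bp = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfilt(sos_bp, speech)
        # Extract the temporal envelope (Hilbert magnitude, low-pass smoothed).
        env = np.abs(hilbert(band))
        sos_lp = butter(2, 50.0, btype="lowpass", fs=fs, output="sos")
        env = sosfilt(sos_lp, env)
        # Modulate a sinewave carrier at the band's geometric center frequency.
        fc = np.sqrt(lo * hi)
        out += env * np.sin(2.0 * np.pi * fc * t)
    return out
```

With only two bands, spectral detail is almost entirely removed while the slow amplitude envelopes survive, which is why such stimuli simulate severely limited electric hearing.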
In summary, the present study demonstrated that after training on Cued Speech, adults with normal hearing can integrate their perception of visible hand cues with their perception of visible speech and simulated “residual hearing,” achieving three-modality integration. An implication is that Cued Speech can provide adults with cochlear implants with an integrative rather than distracting source of information and thus can serve as a multisensory tool for auditory rehabilitation. [Work supported by an NIH/NIDCD Grant R03DC007976.]