Multisensory processing in synesthesia – Differences in the EEG signal during uni- and multimodal processing
Christopher Sinke, Janina Neufeld, Daniel Wiswede, Hinderk M Emrich, Stefan Bleich, Gregor R Szycik

Date: 2012-06-19 01:30 PM – 03:00 PM
Last modified: 2012-04-25

Abstract


Synesthesia is a condition in which stimulation in one processing stream (e.g. letters or music) leads to perception in an unstimulated processing stream (e.g. colors). Behavioral differences in multisensory processing have been shown for multimodal illusions, but the differences in neural processing are still unclear. In the present study, we examined uni- and multimodal processing in 14 people with synesthesia and 13 controls using EEG recordings and a simple detection task. Stimuli were presented either acoustically, visually, or multimodally (simultaneous visual and auditory stimulation). In the multimodal condition, auditory and visual stimuli were either matching or mismatching (e.g. a lion either roaring or ringing). Subjects had to press a button as soon as a stimulus was presented visually or acoustically. Results: ERPs revealed occipital group differences in the negative amplitude between 100 ms and 200 ms after stimulus presentation. Relative to controls, synesthetes showed an increased negative component peaking around 150 ms. This group difference was found in all visual conditions. Unimodal acoustic stimulation led to an increased negative amplitude in synesthetes in the same time window over parietal and occipital electrodes. Overall, this shows that processing in the occipital lobe differs in synesthetes independently of the stimulated modality. In addition, differences between synesthetes and controls in the negative amplitude for incongruent versus congruent multimodal stimuli were detected in the same time window over left frontal sites. This shows that multimodal integration processes also differ in synesthetes.
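
The abstract does not describe the analysis pipeline in detail. As a rough illustration of the kind of comparison reported (mean negative amplitude between 100 and 200 ms at occipital electrodes, contrasted between groups), the minimal Python sketch below uses simulated data; the sampling rate, epoch layout, electrode indices, and all variable names are assumptions for illustration and not the authors' actual method.

```python
# Illustrative sketch only: group comparison of mean ERP amplitude in the
# 100-200 ms window at occipital electrodes. Data here are simulated;
# sampling rate, epoch layout, and channel indices are assumptions.
import numpy as np
from scipy import stats

SFREQ = 500.0                 # assumed sampling rate (Hz)
T_MIN = -0.2                  # assumed epoch onset relative to stimulus (s)
WINDOW = (0.100, 0.200)       # time window of interest (s)
OCCIPITAL_CH = [60, 61, 62]   # hypothetical occipital electrode indices


def window_mean(epochs: np.ndarray) -> float:
    """Mean amplitude over occipital channels in the 100-200 ms window.

    `epochs` has shape (n_trials, n_channels, n_samples) and is assumed
    to be baseline-corrected.
    """
    start = int(round((WINDOW[0] - T_MIN) * SFREQ))
    stop = int(round((WINDOW[1] - T_MIN) * SFREQ))
    erp = epochs.mean(axis=0)                      # trial-average ERP
    return float(erp[OCCIPITAL_CH, start:stop].mean())


rng = np.random.default_rng(0)
n_samples = int(0.8 * SFREQ)  # simulated 800 ms epochs (-200 to +600 ms)
# Simulated subject-level epoch arrays (14 synesthetes, 13 controls).
synesthetes = [rng.normal(size=(80, 64, n_samples)) for _ in range(14)]
controls = [rng.normal(size=(80, 64, n_samples)) for _ in range(13)]

syn_scores = [window_mean(e) for e in synesthetes]
ctl_scores = [window_mean(e) for e in controls]

# Independent-samples t-test on the per-subject window means.
t_val, p_val = stats.ttest_ind(syn_scores, ctl_scores)
print(f"t = {t_val:.2f}, p = {p_val:.3f}")
```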
