Touching the sound: High-frequency oscillations in a distributed cortical network reflect cross-modal semantic matching in haptic-to-auditory priming
Till R. Schneider, Simone Lorenz, Daniel Senkowski, Andreas K. Engel
Poster
Last modified: 2008-05-13
Abstract
When visual input is restricted, as in darkness, we often rely on haptic and auditory information to recognize objects. Haptic object information is provided via the tactile and kinaesthetic senses stimulated by touching objects, whereas auditory object information is conveyed via the auditory system activated by object sounds. Here we examined, in a cross-modal priming paradigm, how active haptic exploration of natural objects affects the neural encoding of subsequently presented object sounds. Sounds were either semantically congruent or incongruent with the haptic objects. High-density electroencephalogram (EEG) recordings and source reconstruction by means of linear beamforming were used to examine neural responses in the gamma band (30-100 Hz) to semantically congruent and incongruent stimuli. Reaction times were shorter for semantically congruent than for incongruent auditory inputs, indicating a haptic-to-auditory priming effect. Neural responses were enhanced for semantically congruent compared with incongruent auditory inputs in early evoked (50-100 ms) and long-latency total (200-400 ms) gamma-band activity. The latter effects were localized to a network of multisensory convergence areas, including the left superior and middle temporal gyri. Our results demonstrate that active haptic exploration primes auditory object recognition in modality-independent semantic networks during multiple stages of information processing.
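The distinction drawn above between evoked (phase-locked) and total (phase-locked plus induced) gamma-band activity can be illustrated with a minimal simulation. This sketch is hypothetical and is not the authors' analysis pipeline; the sampling rate, filter settings, and simulated data are all assumptions for illustration only:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Hypothetical sketch (not the authors' pipeline): evoked gamma power is the
# power of the trial-averaged signal, capturing only phase-locked activity;
# total gamma power averages single-trial power, so it also retains induced
# (non-phase-locked) activity.

fs = 500.0                      # sampling rate in Hz (assumed)
n_trials, n_samples = 50, 500   # 1 s of data per trial (assumed)
rng = np.random.default_rng(0)

# Simulated single-channel EEG: noise plus a 60 Hz burst with random phase,
# i.e. an "induced" gamma response that is not phase-locked to stimulus onset
t = np.arange(n_samples) / fs
trials = rng.standard_normal((n_trials, n_samples))
for i in range(n_trials):
    trials[i] += np.sin(2 * np.pi * 60 * t + rng.uniform(0, 2 * np.pi))

# Band-pass filter into the gamma band (30-100 Hz)
b, a = butter(4, [30 / (fs / 2), 100 / (fs / 2)], btype="band")
gamma = filtfilt(b, a, trials, axis=-1)

# Evoked gamma power: average across trials first, then compute power
evoked_power = np.abs(hilbert(gamma.mean(axis=0))) ** 2

# Total gamma power: compute single-trial power, then average across trials
total_power = (np.abs(hilbert(gamma, axis=-1)) ** 2).mean(axis=0)

print(evoked_power.mean(), total_power.mean())
```

Because the simulated gamma burst has a random phase on each trial, it largely cancels in the trial average, so total gamma power exceeds evoked gamma power; a response phase-locked to stimulus onset would instead appear in both measures.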