Building novel audio-visual objects from abstract auditory and visual stimuli
Oliver Doehrmann, Institute of Medical Psychology, Johann Wolfgang Goethe-University, Frankfurt am Main, Germany
Abstract
The aim of the present study was to investigate whether novel semantic associations can be trained between abstract, “object-like” auditory and visual stimuli. Eleven subjects participated in a first fMRI session (PRE), a training session and a second fMRI session (POST). The fMRI experiments used pictures and sounds of animals or of abstract objects (“fribbles”; see http://www.cog.brown.edu/~tarr/stimuli.html). In the audio-visual (AV) conditions we additionally varied the degree of semantic congruency between pictures and sounds.
Regions of the posterior superior temporal sulcus (pSTS), the middle temporal gyrus (MTG) and the precentral sulcus (PrCS) were involved in both experiments during AV integration of natural stimuli. After training, AV stimulation with abstract material was associated with even more pronounced activation in these same cortical regions and, additionally, in medial frontal and inferior-parietal regions. Moreover, novel incongruency effects for the abstract AV conditions were found in inferior and medial frontal regions as well as the right anterior insula. As these latter regions are assumed to be implicated in semantic processing of natural stimuli, we conclude that our training successfully established novel semantic associations.