How auditory information influences volitional control in binocular rivalry: Modulation of a top-down attentional effect
Manuel Vidal, Victor Barrès

Last modified: 2011-09-02

Abstract


Although multistable perception has long been studied, in recent years paradigms involving ambiguous visual stimuli have been used to assess phenomenal awareness. Indeed, for such stimuli, although the sensory input is invariant, a perceptual decision occurs that brings only one of the possible interpretations to awareness, and the observer experiences inevitable oscillations between these percepts. Only a few studies have tackled bistability for multimodal percepts. Conrad et al. [3] presented dot fields moving in opposite directions to each eye and found that lateral sound motion prolongs the dominance duration of the coherent visual motion percept but not of the other. In that study, volitional control – the capacity to promote a given percept – was not assessed. Van Ee et al. [2] tested how sensory congruency influences volitional control using looming motion stimuli that could be seen, heard, or felt on the hand. They found that attention to the additional modality (sound or touch) must be engaged to promote visual dominance and that, ultimately, only temporal congruency is required. In both studies, such low-level stimuli could have reduced the possibility of finding passive interactions and limited the congruency to temporal features, which motivated the use of higher-level audio-visual speech processing in our experiments.
Accordingly, we used the McGurk effect [4], which involves robust audio-visual integration, to investigate multimodal rivalry. After measuring the perceptual oscillation dynamics of rivaling videos (lips pronouncing /aba/ vs. /aga/), we added the sound /aba/ in order to assess the influence of audio-visual integration on the perceptual dynamics. Adding the sound /aba/ increased the dominance durations of both percepts when viewed passively. For McGurk-sensitive participants, it also increased the capacity to promote the percept of lips uttering /aba/ as compared to the same situation without sound; but not the percept of lips uttering /aga/, although promoting the latter could be equivalent to promoting lips uttering /ada/. Our findings suggest that at higher-level processing stages, auditory cues do interact both with the perceptual decision and with the dominance mechanism involved during visual rivalry. These results are discussed in light of individual differences in audio-visual integration for speech perception. We propose a descriptive model, based on known characteristics of binocular rivalry, that accounts for most of these findings. In this model, top-down attentional control (volition) is modulated by lower-level audio-visual matching.

References


[1] Hupé, J., Joffo, L. & Pressnitzer, D. (2008) Bistability for audiovisual stimuli: Perceptual decision is modality specific. J. Vis. 8, 1-15

[2] van Ee, R., van Boxtel, J.J.A., Parker, A.L. & Alais, D. (2009) Multisensory congruency as a mechanism for attentional control over perceptual selection. J. Neurosci. 29, 11641-11649

[3] Conrad, V., Bartels, A., Kleiner, M. & Noppeney, U. (2010) Audiovisual interactions in binocular rivalry. J. Vis. 10(10):27, 1-15

[4] McGurk, H. & MacDonald, J. (1976) Hearing lips and seeing voices. Nature 264, 746-748
