Human temporal coordination of visual and auditory events in virtual reality
Michiteru Kitazaki

Date: 2012-06-19 01:30 PM – 03:00 PM

Abstract


Since sound travels much more slowly than light, we sometimes hear a sound later than its accompanying light event (e.g., thunder and lightning at a great distance). However, Sugita and Suzuki (Nature, 2003) reported that the brain coordinates a sound with its accompanying light so that the two are perceived as simultaneous at distances up to about 20 m. Thus, within this near field, a light accompanied by a physically delayed sound is perceived as simultaneous with that sound. We aimed to test whether this sound-light coordination occurs in a virtual-reality environment and to investigate the effects of binocular disparity and motion parallax. Six naive participants observed visual stimuli on a 120-inch screen in a darkroom and heard auditory stimuli through headphones. A ball was presented in a textured corridor, and its distance from the participant was varied between 3 and 20 m. The ball changed to red before or after a short (10 ms) white-noise burst (time differences: -120, -60, -30, 0, +30, +60, +120 ms), and participants judged the temporal order of the color change and the sound. We varied visual depth cues (binocular disparity and motion parallax) in the virtual-reality environment and measured the physical delay at which the visual and auditory events were perceived as simultaneous. We found no sound-light coordination when binocular disparity and motion parallax were absent, but did find it when both cues were present. These results suggest that binocular disparity and motion parallax are effective for sound-light coordination in a virtual-reality environment, and that the richness of depth cues is important for this coordination.
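As a rough illustration of the quantities involved (not the authors' analysis code), the sketch below shows one common way to estimate the point of subjective simultaneity (PSS) from temporal-order-judgment data by fitting a logistic psychometric function, and compares it with the physical sound-travel delay at a given viewing distance. The response proportions, the sign convention for the time differences, and the use of a logistic fit are assumptions made for this example only.

    # Illustrative sketch: estimating the PSS from temporal-order judgments and
    # comparing it with the physical sound-travel delay. All data are hypothetical.
    import numpy as np
    from scipy.optimize import curve_fit

    SPEED_OF_SOUND = 343.0  # m/s, approximate value in air at room temperature

    def sound_travel_delay_ms(distance_m):
        """Physical delay (ms) of a sound arriving from distance_m metres away."""
        return distance_m / SPEED_OF_SOUND * 1000.0

    def logistic(soa, pss, slope):
        """Probability of reporting 'light first' at a given SOA (ms).
        The PSS is the SOA at which this probability equals 0.5."""
        return 1.0 / (1.0 + np.exp(-(soa - pss) / slope))

    # Time differences used in the experiment, in ms
    # (sign convention assumed here: negative = sound leads the color change).
    soas = np.array([-120, -60, -30, 0, 30, 60, 120], dtype=float)

    # Hypothetical proportions of 'light first' responses at each SOA.
    p_light_first = np.array([0.05, 0.15, 0.30, 0.45, 0.70, 0.85, 0.95])

    # Fit the psychometric function and read off the PSS.
    (pss, slope), _ = curve_fit(logistic, soas, p_light_first, p0=[0.0, 30.0])

    print(f"Estimated PSS: {pss:.1f} ms")
    print(f"Physical sound delay at 20 m: {sound_travel_delay_ms(20.0):.1f} ms")

Under this sketch, a PSS shifted toward the physical sound delay at far viewing distances would indicate sound-light coordination, whereas a PSS near 0 ms regardless of distance would indicate its absence.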

References


Sugita, Y., & Suzuki, Y. (2003). Audiovisual perception: Implicit estimation of sound-arrival time. Nature, 421, 911.
