Audiovisual Synchrony Perception for Complex Stimuli: How “Special” Is Speech?
Poster Presentation
Argiro Vatakis
Oxford University
Charles Spence
Oxford University

Abstract ID Number: 40
Full text: PDF
Last modified: June 24, 2005
Abstract
This study investigated the perception of synchrony for realistic audiovisual events. A series of speech, music, and object-action video clips were presented at a range of stimulus onset asynchronies (SOAs) using the method of constant stimuli. Participants made unspeeded temporal order judgments (TOJs) regarding which modality (audition or vision) appeared to have been presented first. The perception of synchrony for the object-action and guitar music clips differed from that for the piano music clips, whereas no difference was observed between speech and piano music. Speech and piano music both required the auditory stream to be presented before the visual stream for synchrony to be perceived, while the object-action and guitar music events required vision to lead. These results suggest that speech is not special in terms of the visual lag normally required for the perception of simultaneity. Response accuracy differed only marginally across the video clips presented. These results show that the window for audiovisual integration is wider (i.e., the just noticeable difference, or JND, is larger) for complex audiovisual events than for the simple point-light and tone-burst stimuli typically used in previous research on multisensory temporal perception.
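For readers unfamiliar with how a JND and point of subjective simultaneity (PSS) are commonly estimated from TOJ data, the Python sketch below fits a cumulative Gaussian psychometric function to hypothetical "vision first" response proportions. The SOA values, response proportions, and the cumulative-Gaussian model are illustrative assumptions only, not the study's actual data or analysis.

    # Illustrative sketch: estimating PSS and JND from TOJ data by fitting a
    # cumulative Gaussian psychometric function (all values are made up).
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    # Hypothetical SOAs in ms (negative = auditory stream led) and the
    # proportion of "vision first" responses observed at each SOA.
    soa = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300])
    p_vision_first = np.array([0.05, 0.10, 0.25, 0.40, 0.55, 0.70, 0.85, 0.95, 0.98])

    def psychometric(x, pss, sigma):
        """Cumulative Gaussian giving the probability of a 'vision first' response."""
        return norm.cdf(x, loc=pss, scale=sigma)

    (pss, sigma), _ = curve_fit(psychometric, soa, p_vision_first, p0=[0.0, 100.0])

    # PSS: the SOA at which the two streams appear simultaneous.
    # JND: half the distance between the 25% and 75% points of the fitted
    # function (~0.6745 * sigma); a larger JND means a wider integration window.
    jnd = norm.ppf(0.75) * sigma
    print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")

Under this convention, a larger fitted sigma (and hence a larger JND) corresponds to a wider temporal window within which auditory and visual streams are judged as synchronous.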