Movement synchronisation to multisensory temporal cues
Mark T Elliott, Andrew E Welchman, Michail Doumas, Alan M Wing
Talk
Last modified: 2008-05-13
Abstract
Responding and synchronising movements to external events is a task we perform on a daily basis, whether tapping a foot along to a song or keeping in step with a dance partner. The brain often has access to multiple sensory cues useful for timing actions (e.g., the auditory beat, flashing lights, and the touch of a dance partner); yet previous studies, presenting each cue independently, have shown that auditory cues dominate other modalities in the control of movement timing. Here we test a maximum likelihood estimation (MLE) model of multisensory combination for the temporal control of action. We asked participants to tap their index finger in time with a beat provided by pairs of auditory, haptic, or visual metronomes presented in combination. Differing levels of noise were added to manipulate the reliability of the individual metronomes. Synchronisation performance in multimodal conditions was predominantly in accordance with MLE combination of the component signals: metronome-tap asynchronies were intermediate between those for the component modalities, and variability was lower. Our results suggest that when timing actions to coincide with an external event, the brain combines the available sensory information in a quasi-optimal way to minimise synchronisation variability.
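
The MLE combination rule referred to in the abstract weights each cue inversely by its variance, so the combined estimate is always at least as reliable as the best single cue. The sketch below illustrates this prediction for two noisy metronome cues; the noise levels (standard deviations in ms) are illustrative assumptions, not figures from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality timing noise (SDs in ms) -- illustrative
# assumptions only, not values reported in the abstract.
sigma_aud = 10.0   # auditory metronome
sigma_vis = 30.0   # visual metronome

# MLE weights are inversely proportional to each cue's variance.
w_aud = sigma_vis**2 / (sigma_aud**2 + sigma_vis**2)
w_vis = sigma_aud**2 / (sigma_aud**2 + sigma_vis**2)

# Predicted variance of the combined estimate: lower than either cue alone.
sigma_comb = np.sqrt((sigma_aud**2 * sigma_vis**2)
                     / (sigma_aud**2 + sigma_vis**2))

# Simulate noisy beat-time estimates around a true beat at t = 0 ms.
n = 100_000
aud = rng.normal(0.0, sigma_aud, n)
vis = rng.normal(0.0, sigma_vis, n)
combined = w_aud * aud + w_vis * vis

print(round(sigma_comb, 2))       # analytic prediction (~9.49 ms)
print(round(combined.std(), 2))   # empirical SD, close to the prediction
```

Note that the predicted combined SD (about 9.5 ms here) falls below the more reliable cue's 10 ms, which is the signature of optimal integration the study tests against the observed tapping variability.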