Transferrable Learning of Multisensory Cues in Flight Simulation
Georg F Meyer, Li Wong, Emma Timson, Philip Perfect, Mark White

Last modified: 2011-08-24

Abstract


Flight simulators, which provide visual, auditory and kinematic (physical motion) cues, are increasingly used for pilot training. We have previously shown that kinematic cues, but not auditory cues representing aircraft motion, improve target tracking performance for novice ‘pilots’ in a simulated flying task (Meyer et al., IMRF 2010).
Here we explore the effect of learning on task performance. Our subjects were first tested on a target tracking task in a helicopter flight simulation. They were then trained in a simulator-simulator, which provided full audio and simplified visuals but no kinematic signals, to test whether auditory cues can be learned. After training we evaluated flight performance in the full simulator again.
We show that after two hours of training our participants use auditory cues as efficiently as kinematic cues to improve target tracking performance. The performance improvement relative to a condition in which no audio is presented is robust even when the sound environment used during training is replaced by a very different audio signal that is modulated in amplitude and pitch in the same way as the training signal. This shows that training is not signal-specific: our participants learn to extract transferrable information about sound pitch and amplitude to improve their flying performance.
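As a minimal sketch of the kind of transfer stimulus described above, the Python snippet below shows how a carrier tone can be given a prescribed amplitude envelope and pitch contour. All names, parameter values and the synthesis method are illustrative assumptions, not the stimuli used in the study.

import numpy as np

def modulated_carrier(amp_envelope, pitch_contour_hz, fs=44100):
    # Hypothetical illustration: synthesise a tone whose amplitude and
    # instantaneous pitch follow the supplied per-sample contours.
    phase = 2.0 * np.pi * np.cumsum(pitch_contour_hz) / fs  # integrate frequency to phase
    return amp_envelope * np.sin(phase)

# Example: a 2-second tone whose loudness and pitch rise and fall together,
# standing in for an engine-like modulation pattern (assumed values).
fs = 44100
t = np.arange(0, 2.0, 1.0 / fs)
amp = 0.5 + 0.5 * np.sin(2.0 * np.pi * 0.5 * t)       # slow amplitude modulation
pitch = 220.0 + 60.0 * np.sin(2.0 * np.pi * 0.5 * t)  # matched pitch modulation
signal = modulated_carrier(amp, pitch, fs)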
