Multisensory landmarks improve route memory performance in humans: a virtual reality study

Alexandre LEHMANN, LPPA - College de France

Abstract
Real-world perception rarely involves separate sensory modalities. When remembering one's way in an unknown city, we simultaneously use several cognitive strategies. One of them is to remember selected landmarks encountered along the way and associate them with egocentric actions (such as "turn right at the fountain"). Navigation studies in humans have rarely focused on multisensory simulated environments.

We investigated the extent to which non-visual cues (namely auditory cues) from environmental landmarks could modulate spatial route memory performance. Subjects navigated in a maze in which landmarks were the only available source of route information. These landmarks were either purely visual or audio-visual. We hypothesised that extra auditory cues would increase route recall performance and, more specifically, that the added value of such cues lies in the temporal organisation of the landmark sequence.

We found that response latencies were significantly modulated in the presence of audio-visual landmarks, whereas purely visual landmarks showed no effect. We conclude that auditory information from landmarks improves navigation performance and leads to the use of a different, sequence-based spatial encoding strategy, which we consider to reflect a more ecological process.

