Temporal integration in sound localization via head rotation
Ewan Andrew Macpherson, Janet Kim

Date: 2012-06-22 09:30 AM – 11:00 AM
Last modified: 2012-04-24

Abstract

Information about a sound source's location in the front/back dimension is present in the relation between head rotation and the resulting changes in interaural time- or level-difference cues. The use of such dynamic cues for localization requires the auditory system to have access to an accurate representation of the orientation and motion of the head in space. We measured, in active and passive rotation conditions, and as a function of head-rotation angle and velocity, normally hearing human listeners' ability to localize front and rear sources of a low-frequency (0.5-1 kHz) noise band that was not accurately localizable in the absence of head motion. Targets were presented while the head was in motion at velocities of 50-400 deg/s (active neck rotation) or 25-100 deg/s (whole-body passive rotation), and were gated on and off as the head passed through a variable-width spatial window. Accuracy increased as window width was increased, which provided access to larger interaural cue changes, but decreased as head-turn velocity increased, which reduced the duration of the stimuli. For both active and passive rotation, these effects were almost exactly reciprocal, such that performance was related primarily to the duration of the stimulus, with ~100 ms duration required for 75% correct front/back discrimination regardless of the cue-change magnitude or mode of rotation. The efficacy of the dynamic auditory cues in the passive rotation condition suggests that vestibular input is sufficient to inform the auditory system about head motion.
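The reciprocal relation reported above can be summarized as stimulus duration = window width / head-turn velocity. The following minimal Python sketch is an illustration, not the study's analysis code: the window widths are hypothetical values chosen for the example, while the velocities fall within the reported ranges. It shows how different width/velocity combinations compare against the ~100 ms duration associated with 75% correct front/back discrimination.

```python
# Illustrative sketch: stimulus duration is the spatial window width divided by
# head-turn velocity, so width/velocity pairs giving the same duration are
# predicted to yield similar front/back discrimination performance.

def stimulus_duration_ms(window_deg: float, velocity_deg_per_s: float) -> float:
    """Duration (ms) for which the target is gated on while the head sweeps the window."""
    return 1000.0 * window_deg / velocity_deg_per_s

THRESHOLD_MS = 100.0  # ~100 ms reported as needed for 75% correct front/back discrimination

# Hypothetical window widths; velocities are within the reported 50-400 deg/s (active) range.
for window_deg, velocity in [(5, 50), (10, 100), (20, 200), (40, 400), (10, 400)]:
    dur = stimulus_duration_ms(window_deg, velocity)
    verdict = "above" if dur >= THRESHOLD_MS else "below"
    print(f"{window_deg:>3} deg window at {velocity:>3} deg/s -> {dur:6.1f} ms ({verdict} ~100 ms)")
```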
