Body- and gaze-centered coding of touch locations during a dynamic task
Lisa Marie Pritchett, Michael J Carnevale, Laurence R Harris

Date: 2012-06-21 01:30 PM – 03:00 PM
Last modified: 2012-04-27

Abstract


We have previously reported that head position affects the perceived location of touch differently depending on the dynamics of the task the subject is involved in. When touch was delivered and responses were made with the head rotated, touch location shifted in the direction opposite to the head, consistent with body-centered coding. When touch was delivered with the head rotated but the response was made with the head centered, touch location shifted in the same direction as the head, consistent with gaze-centered coding. Here we tested whether moving the head between touch and response would modulate the effects of head position on perceived touch location. Each trial consisted of three periods. In the first, arrows and LEDs guided the subject to a randomly chosen head orientation (90 degrees left, right, or center) and a vibration stimulus was delivered. In the second, subjects were either guided to turn their head or to keep it in the same orientation. In the final period, they were again guided either to turn their head or to keep it still before reporting the perceived location of the touch on a visual scale using a mouse and computer screen. Reported touch location was shifted in the direction opposite to the head orientation during touch presentation, regardless of the orientation during the response or whether a movement was made before the response. The size of the effect was much smaller than in our previous results. These results are consistent with touch location being coded in both a gaze-centered and a body-centered reference frame under dynamic conditions.
