Audio-visual interactions in the perception of intention from actions

Hanni Kiiski, Ludovic Hoyet, Katja Zibrek, Carol O'Sullivan, Fiona N. Newell

Last modified: 2013-05-05

Abstract


Although humans can infer other people’s intentions from their visual actions (Blakemore & Decety, 2001), it is not well understood how auditory information influences this process. We investigated whether auditory emotional information can influence the perceived intention of another person from their visual body motion. Participants viewed a set of videos presenting point-light displays (PLDs) of 10 actors (5 male) who were asked to portray a ‘hero’ or a ‘villain’ character. In a 2-AFC design, participants categorised each visual character as having ‘good’ or ‘bad’ intentions, and response accuracy and speed were recorded. Performance on visual-only trials exceeded chance, suggesting that participants were efficient at judging intentions from PLDs. We then paired auditory vocal stimuli associated with either positive (happy) or negative (angry) emotions with each of the PLDs. The auditory stimuli were taken from Belin et al. (2008) and consisted of nonverbal bursts (‘ah’) recorded from 10 actors (5 male). Each vocalisation was randomly paired with a sex-matched PLD (60 PLD-voice combinations). We found that both the categorisation responses and the speed of those responses were affected by the inclusion of the auditory stimuli. Specifically, reaction times were facilitated, relative to unisensory conditions, when the auditory emotion (positive or negative) matched the perceived intention (good or bad, respectively). Our findings suggest important interactions between audition and visual actions in perceiving intentions in others and are consistent with previous findings of audio-visual interactions in action-specific visual regions of the brain (e.g., Barraclough et al., 2005).

Keywords


audio-visual interaction; intention perception; action perception

References


Barraclough, N. E., Xiao, D., Baker, C. I., Oram, M. W., & Perrett, D. I. (2005). Integration of visual and auditory information by superior temporal sulcus neurons responsive to the sight of actions. Journal of Cognitive Neuroscience, 17(3), 377-391.

Belin, P., Fillion-Bilodeau, S., & Gosselin, F. (2008). The Montreal Affective Voices: A validated set of nonverbal affect bursts for research on auditory affective processing. Behavior Research Methods, 40(2), 531-539.

Blakemore, S.-J., & Decety, J. (2001). From the perception of action to the understanding of intention. Nature Reviews Neuroscience, 2, 561-567.
