Human sounds facilitate conscious processing of emotional faces
Bernard M. C. Stienen, Fiona N. Newell

Date: 2012-06-20 02:30 PM – 04:00 PM

Abstract


The interaction of audio-visual signals conveying information about the emotional state of others may play a significant role in social engagement. There is ample evidence that recognition of visual emotional information does not necessarily depend on conscious processing. However, little is known about how multisensory integration of affective signals relates to visual awareness. Previous research using masking experiments has shown that audio-visual integration is relatively independent of visual awareness. However, masking does not capture the dynamic nature of consciousness, in which stimulus selection depends on a multitude of signals. We therefore presented neutral and happy faces to one eye and houses to the other, resulting in perceptual rivalry between the two stimuli, while simultaneously playing a laughing sound, a coughing sound, or no sound. Participants were asked to report when they saw the faces, the houses, or a mixture of the two, and were instructed to ignore the sounds. When happy facial expressions were shown, participants reported seeing fewer houses than when neutral expressions were shown. In addition, human sounds increased the viewing time of the faces compared to when no sound was played. Taken together, the emotional expression of a face affects whether it is selected for visual awareness, and this selection is further facilitated by human sounds.
