Multisensory learning: from calibration to associative learning to perceptual learning
Ladan Shams, Robert Jacobs, Aaron Seitz, Robyn Kim
Talk
Time: 2009-07-02 12:10 PM – 12:30 PM
Abstract
Multisensory learning and adaptation can be classified into three categories: A) one modality calibrating another modality, B) the two modalities becoming associated with each other, and C) one modality facilitating learning in another modality. We will present a Bayesian network framework that unifies all three types of phenomena, and present experimental results exemplifying each of the three categories of learning. Although these three forms of learning appear very different from one another, we argue that they all fit within the same Bayesian network framework. For all three classes of phenomena, this computational viewpoint suggests that the crossmodal signal provides feedback and error information that can be used by each individual sensory system for learning. As an example of class A, we will discuss a study showing that haptic information can teach the visual modality by adjusting the relative weights of the different visual depth cues (Atkins, Fiser, & Jacobs, 2001). As an example of class B, we will present data showing how associations between auditory and visual stimuli can form automatically in the context of a statistical learning paradigm (Seitz, Kim, van Wassenhove, & Shams, 2007). Finally, as an example of class C, we will present results showing facilitation of visual perceptual learning by sound. We show this in the context of a motion detection task in which subjects are presented with correlated auditory and visual motion directions. When auditory and visual motion directions are congruent (i.e., in the same direction) during training, subjects show greater improvement in their ability to detect the visual motion direction (in the absence of sound) than when they are trained with visual stimuli alone (Seitz, Kim, & Shams, 2006) or with incongruent auditory and visual motion directions (Kim, Seitz, & Shams, 2008). Together, these results show how the brain is able to associate stimuli between the senses and then use these associations as teaching signals to improve learning within each modality.
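The abstract does not spell out the model's equations, but the core idea that a crossmodal signal supplies the error term for within-modality learning can be illustrated with a small simulation. The Python sketch below is a hypothetical illustration only: the noise levels, initial weights, and delta-rule update are assumptions made for demonstration, not the methods of the cited studies. It shows a haptic "teacher" recalibrating the weights vision assigns to two of its own depth cues, in the spirit of class A.

# Minimal, hypothetical sketch (not the published models): a teaching signal
# from one modality (here labeled haptic) recalibrates the weights a second
# modality (vision) assigns to its own cues, as in class A above. All noise
# levels, initial weights, and the delta-rule update are illustrative
# assumptions for demonstration only.
import numpy as np

rng = np.random.default_rng(0)

sigma_cue = np.array([3.0, 1.0])   # assumed noise (std) of two visual depth cues
sigma_haptic = 0.5                 # assumed noise of the haptic teaching signal
w = np.array([0.5, 0.5])           # initial (mis-calibrated) visual cue weights
eta = 0.002                        # small learning rate to keep the update stable

for trial in range(5000):
    s = rng.uniform(-10, 10)                   # true scene property (e.g., depth)
    cues = s + rng.normal(0, sigma_cue)        # two noisy visual cue measurements
    haptic = s + rng.normal(0, sigma_haptic)   # crossmodal (haptic) measurement

    visual_estimate = w @ cues                 # vision's current weighted estimate

    # The crossmodal signal provides the feedback/error term: the mismatch
    # between the haptic measurement and the visual estimate drives learning.
    error = haptic - visual_estimate
    w += eta * error * cues                    # delta-rule reweighting of the cues

print("learned visual cue weights:", np.round(w, 2))
# With these settings the less noisy visual cue ends up with the larger
# weight, mirroring the idea that touch can re-teach vision how to weight
# its own depth cues.

In this toy setup the weights drift toward values that make the visual estimate agree with the haptically informed one, which is one concrete reading of "the crossmodal signal provides feedback and error information" for learning within a single modality.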