Analyzing haptic and visual object categorization of parametrically-defined shapes
Nina Gaißert, Christian Wallraven, Heinrich H Bülthoff
Poster
Last modified: 2008-05-09
Abstract
To investigate multi-sensory, perceptual representations of three-dimensional object spaces, we generated complex, shell-shaped objects by altering three parameters defining shell shape. For haptic experiments, 3D-printed plastic models were freely explored by blindfolded participants with both hands. For visual experiments, we used 2D images of these objects.
Previously, we reported results of a similarity rating task in which we split the three-dimensional object space into three orthogonal planes. Multidimensional scaling (MDS) of the pairwise similarity ratings showed that participants reproduced the three planes almost exactly, both visually and haptically. Here, we report results of a categorization task in which all objects were presented simultaneously, either visually or haptically, to ten participants, who then sorted the objects into as many groups as they liked.
MDS analyses revealed a three-dimensional perceptual space underlying both the visual and the haptic data. Interestingly, the three dimensions corresponded to the parameters of shell shape, with different weightings of the dimensions in the visual and the haptic condition. Our results show that humans can recover the underlying parameters of a complex, three-dimensional object space surprisingly well in both similarity and categorization tasks, using either the visual or the haptic modality.
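The MDS step above takes a matrix of pairwise dissimilarities (derived from similarity ratings or category groupings) and embeds the objects in a low-dimensional space whose Euclidean distances approximate those dissimilarities. The abstract does not specify which MDS variant was used; below is a minimal sketch of classical (Torgerson) metric MDS, with illustrative data standing in for the actual ratings:

```python
import numpy as np

def classical_mds(dissimilarities, n_dims=3):
    """Classical (Torgerson) MDS: embed objects so that Euclidean
    distances approximate the given pairwise dissimilarities."""
    D = np.asarray(dissimilarities, dtype=float)
    n = D.shape[0]
    # Double-center the squared dissimilarity matrix.
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    # Eigendecompose and keep the top n_dims non-negative components.
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1][:n_dims]
    vals = np.clip(eigvals[order], 0, None)
    return eigvecs[:, order] * np.sqrt(vals)

# Toy example: four points in a hypothetical 3D parameter space,
# standing in for ratings converted to dissimilarities.
pts = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
X = classical_mds(D, n_dims=3)
# For exact Euclidean input, the recovered configuration reproduces
# the original distances (up to rotation and reflection).
D_hat = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
print(np.allclose(D, D_hat, atol=1e-8))
```

The recovered coordinates are only defined up to rotation, reflection, and translation, which is why dimensions extracted from visual and haptic data must be aligned (e.g. by Procrustes analysis) before their weightings can be compared.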