Sensory attribute identification time cannot explain the common temporal limit of binding different attributes and modalities.
Waka Fujisaki, Shin'ya Nishida

Abstract

An informative performance measure of the brain's integration across different sensory attributes/modalities is the critical temporal rate of feature alternation (e.g., between red and green) beyond which observers cannot identify the feature value specified by a timing signal from another attribute (e.g., a pitch change). Interestingly, this limit, which we called the critical crowding frequency (CCF), is fairly low and nearly constant (~2.5 Hz) regardless of the combination of attributes and modalities (Fujisaki & Nishida, 2010, IMRF). One might suppose that the CCF reflects the processing time the brain needs to identify the specified feature value on the fly. On this account, the similarity in CCF across conditions would be ascribed to similar identification times for the attributes we used (luminance, color, orientation, pitch, vibration). To test this idea, we estimated the identification time of each attribute as the difference between Go/No-Go choice reaction time and simple reaction time. Contrary to this prediction, we found significant differences among attributes (e.g., ~160 ms for orientation, ~70 ms for pitch). The results are more consistent with our proposal (Fujisaki & Nishida, Proc Roy Soc B) that the CCF reflects a common rate limit on specifying what happens when (timing-content binding) by a central, presumably postdictive, mechanism.
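To make the subtractive logic explicit (notation ours, introduced purely for illustration), the identification time for an attribute A was estimated as

$$ T_{\mathrm{identify}}(A) \approx RT_{\mathrm{Go/No\text{-}Go}}(A) - RT_{\mathrm{simple}}(A), $$

the assumption being that the Go/No-Go task requires identifying the feature value before responding, whereas the simple reaction task requires only detection, so the difference approximately isolates the identification stage.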
