An important component of animal cognition and goal-oriented behavior is the integration of multiple sources of information. Although in animals such as rats visual cues appear to be an important source, their navigational capabilities do not cease in the dark, suggesting that other modalities, such as proprioception, contribute as well. However, it is not yet clear how the integration of multiple sources of information could be performed to ensure optimal performance. Here we take suggestions from the domain of multi-modal speech perception to investigate how, in the context of the Distributed Adaptive Control (DAC) architecture, view-dependent visual cues and instantaneous allocentric spatial information can be integrated. We evaluate our model in a robot foraging task. On the basis of our results we suggest mechanisms that could facilitate the transformation of egocentrically defined actions into allocentric behavior.