An autonomous robot engaged in missions should be able to generate, update, and process its own actions. It is not plausible that the meaning of the actions used by the robot is given from outside the system itself. Rather, this meaning should be anchored to the world through the perceptual abilities of the robot. We present an approach to conceptual action representation based on a "conceptual" level that acts as an intermediate level between symbols and data coming from sensors. Symbolic representations are interpreted by mapping them onto the conceptual level through a mechanism based on artificial neural networks.
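The intermediate-level idea can be illustrated with a minimal sketch. All names here (the prototype coordinates, the linear map standing in for a trained network) are hypothetical and chosen only to show the three-layer structure: raw sensor data is projected into a conceptual space, and a symbol is grounded as the nearest prototype point in that space.

```python
import math

# Conceptual space: each symbol is anchored to a prototype point
# (illustrative 2-D coordinates, not taken from the paper).
PROTOTYPES = {
    "grasp": (0.9, 0.1),
    "push":  (0.1, 0.9),
}

# Stand-in for the trained neural mapping from raw sensor readings
# to conceptual coordinates (a fixed linear map, for brevity).
WEIGHTS = [(0.5, 0.0, 0.5),
           (0.0, 1.0, 0.0)]

def to_conceptual(sensor):
    """Project a 3-D sensor reading into the 2-D conceptual space."""
    return tuple(sum(w * s for w, s in zip(row, sensor)) for row in WEIGHTS)

def interpret(sensor):
    """Ground a symbol: the prototype nearest to the projected point."""
    point = to_conceptual(sensor)
    return min(PROTOTYPES, key=lambda sym: math.dist(PROTOTYPES[sym], point))
```

A call such as `interpret((1.0, 0.1, 0.8))` projects the reading to conceptual coordinates and returns the symbol whose prototype lies closest, so the symbol's meaning derives from perceptual data rather than from an external assignment.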