Visually-guided locomotion is important for autonomous robotics. However, there are several difficulties; for instance, robot locomotion induces head shaking that constrains stable image acquisition and the possibility of relying on that information to act accordingly. In this work, we propose a combined approach based on a controller architecture that is able to generate locomotion for a quadruped robot and a genetic algorithm to generate head movement stabilization. The movement controllers are biologically inspired by the concept of Central Pattern Generators (CPGs), which are modelled as nonlinear dynamical systems, namely coupled Hopf oscillators. This approach allows us to explicitly specify parameters such as the amplitude, offset and frequency of movement, and to smoothly modulate the generated oscillations according to changes in these parameters. Thus, in order to achieve the desired head movement, opposite to the one induced by locomotion, it is necessary to appropriately tune the CPG parameters. Since this is a nonlinear and non-convex optimization problem, the tuning of the CPG parameters is achieved using a global optimization method. The genetic algorithm searches for the best set of parameters that generates the head movement that reduces the head shaking caused by locomotion. Optimization is performed offline, according to the head movement induced by the locomotion when no stabilization procedure is applied. In order to evaluate the resulting head movement, a fitness function based on the Euclidean norm is investigated. Moreover, a constraint handling technique based on tournament selection was implemented. Experimental results on a simulated AIBO robot demonstrate that the proposed approach generates head movement that significantly reduces the shaking induced by locomotion.
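To make the CPG formulation concrete, the following is a minimal sketch (not the authors' implementation) of a single Hopf-oscillator unit with explicit amplitude, offset and frequency parameters, integrated with forward Euler; the parameter values, gain `alpha` and time step are illustrative assumptions. Changing the parameters at run time smoothly modulates the generated oscillation, which is the property the genetic algorithm exploits when tuning the head movement.

```python
import numpy as np

def hopf_step(x, z, A, O, omega, alpha=5.0, dt=0.001):
    """One Euler step of a Hopf oscillator: x oscillates around offset O
    with amplitude A and angular frequency omega (limit-cycle behaviour)."""
    mu = A ** 2                      # squared target amplitude of the limit cycle
    r2 = (x - O) ** 2 + z ** 2       # squared distance from the offset
    dx = alpha * (mu - r2) * (x - O) - omega * z
    dz = alpha * (mu - r2) * z + omega * (x - O)
    return x + dx * dt, z + dz * dt

# Example: generate a joint trajectory and modulate amplitude/frequency halfway through;
# the output converges smoothly to the new oscillation instead of jumping.
x, z = 0.01, 0.0
trajectory = []
for k in range(20000):
    A, omega = (0.3, 2 * np.pi * 1.0) if k < 10000 else (0.5, 2 * np.pi * 1.5)
    x, z = hopf_step(x, z, A=A, O=0.1, omega=omega)
    trajectory.append(x)
```

In the combined approach described above, units like this would drive the head joints, and the genetic algorithm would search over (A, O, omega) per joint, scoring candidates with the Euclidean-norm fitness computed against the head motion recorded without stabilization.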