This study continues a line of research in which the motor output of the eye and the hand during a reach task is increasingly dissociated, in order to gain insight into the neural control of eye-hand coordination in situations requiring the integration of rule-based information for action. Nine neurologically healthy participants made sliding finger movements over a clear touch-sensitive screen to displace a cursor from a central target to one of four peripheral targets. The visual targets and the cursor were projected onto a horizontal surface on top of, and in alignment with, the touch screen. Movements were made under conditions in which the direction of cursor motion (visual feedback) was either the same as that of the finger movement or rotated 180° from it. At the same time, participants' eyes were required either to maintain central fixation or to make saccades away from the cued target direction. We observed distinct movement latencies, velocities, movement times, movement path curvatures, and endpoint errors for both the eye and the hand across the dissociated task conditions. Interestingly, when a saccade was followed by a hand movement in the opposite direction, the eye movement showed target-dependent path curvatures resembling those of the corresponding hand movement. This occurred even though the eye movement finished before the hand movement started, and even though, compared with those of the hand, the inertial and biomechanical constraints on eye movements in the four cardinal directions are almost negligible. These results provide novel evidence for the notion that eye and hand movements are spatially coupled during reaching/pointing.