Assistive robotic technologies that use neural interface systems are designed to allow people with limited mobility to assert control with signals directly from their brains. These robotic systems require the detection and analysis of raw brain signals, machine learning methods to translate these signals into useful commands, and the development of an interface between neural signals and robot control. In this paper, a method for controlling a 4-degree-of-freedom RRRR WAM robotic arm with alpha brain waves of a test subject obtained via electroencephalography (EEG) is presented. The OpenBCI system electrodes and board are used to detect alpha waves and convert them to a digital signal. A robust serial communication interface is developed to convert OpenBCI data into robot commands. An accelerometer embedded in the OpenBCI board is used to implement left-right motion of the robot. To assess the performance of the system, we successfully demonstrate two primary tasks: alpha wave robot control and combined alpha wave and accelerometer robot control. The methods can be readily extended to include control from other brain regions and additional robotic tasks, paving the way for more complex interactions between robots and human brains.
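The decision logic implied by the abstract (alpha activity gates motion, accelerometer tilt selects left or right) can be sketched as a simple mapping. This is an illustrative assumption, not the authors' actual protocol: the function name, thresholds, and command strings are all hypothetical, and a real implementation would read OpenBCI packets over a serial port and estimate alpha-band power from the EEG stream.

```python
# Hypothetical sketch of the alpha-wave + accelerometer command mapping
# described in the abstract. All names, thresholds, and command strings
# are assumptions for illustration only.

ALPHA_THRESHOLD = 8.0   # assumed alpha-band power threshold (arbitrary units)
TILT_THRESHOLD = 0.3    # assumed accelerometer tilt threshold (g)

def decode_command(alpha_power: float, accel_x: float) -> str:
    """Map an alpha-band power estimate and one accelerometer axis
    to a simple robot command string."""
    if alpha_power < ALPHA_THRESHOLD:
        return "HOLD"            # no strong alpha activity: keep the arm still
    if accel_x > TILT_THRESHOLD:
        return "MOVE_RIGHT"      # board tilted right
    if accel_x < -TILT_THRESHOLD:
        return "MOVE_LEFT"       # board tilted left
    return "MOVE_FORWARD"        # alpha present, no tilt

if __name__ == "__main__":
    print(decode_command(10.0, 0.5))   # alpha present, tilted right
    print(decode_command(3.0, 0.5))    # alpha absent: hold
```

In practice the command strings would be written over the serial interface to the WAM arm's controller; the two demonstrated tasks correspond to using only the alpha gate versus combining it with the tilt branches.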