A brain-machine interface (BMI) facilitates the control of machines through the analysis and classification of signals taken directly from the human brain. Using an electroencephalograph (EEG) to detect neurological activity permits the collection of brain-signal data without the need for invasive technology or procedures. A 14-electrode EPOC headset produced by Emotiv captures live data, which is then classified and encoded into control signals for a 7-degree-of-freedom robotic arm. The collected data is analyzed using independent component analysis (ICA) based feature extraction and a neural network classifier, which assigns the EEG data to one of four control signals: lift, lower, rotate clockwise, and rotate counter-clockwise. Additionally, the system monitors the collected data for electromyography (EMG) signals indicative of facial-muscle movement; these detections provide two additional control signals: open and close. A personal set of EEG data patterns is trained for each individual, with each control signal initially requiring only a few minutes to train, while EMG detections are measured against a generic threshold common to all users. Once a user has trained their personal data into the system, any positive detection triggers a signal to the interfaced robotic arm to perform a corresponding, discrete action. Currently, subjects are able to repeatedly and accurately execute two EEG commands within a short period of time. As the number of EEG-based commands increases, the training time required for accurate control grows significantly, whereas EMG-based control is almost always immediately responsive. To extend the range of available controls beyond a few discrete actions, this research aims to refine the algorithmic steps of classification and detection so that an increased share of the training burden shifts onto the computer.
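The detection-to-command mapping described above can be sketched as follows. This is a minimal illustration, not the system's actual implementation: the function and signal names are hypothetical, the threshold value is a placeholder, and the real pipeline performs ICA feature extraction and neural-network classification upstream of this dispatch step.

```python
# Hypothetical sketch of the detection-to-command dispatch described in the
# abstract. Class indices, event names, and the threshold are illustrative.

EEG_COMMANDS = {0: "lift", 1: "lower", 2: "rotate_cw", 3: "rotate_ccw"}
EMG_COMMANDS = {"facial_movement_a": "open", "facial_movement_b": "close"}

EMG_THRESHOLD = 40.0  # placeholder generic amplitude threshold, same for all users


def detect_emg(channel_samples, threshold=EMG_THRESHOLD):
    """Flag a facial-muscle (EMG) event when peak amplitude crosses the
    generic threshold; unlike EEG commands, no per-user training is needed."""
    return max(abs(s) for s in channel_samples) > threshold


def dispatch(eeg_class=None, emg_event=None):
    """Map one positive detection to one discrete robotic-arm action."""
    if emg_event in EMG_COMMANDS:
        return EMG_COMMANDS[emg_event]
    if eeg_class in EEG_COMMANDS:
        return EEG_COMMANDS[eeg_class]
    return None  # no confident detection: the arm performs no action


print(dispatch(emg_event="facial_movement_b"))  # -> close
print(dispatch(eeg_class=0))                    # -> lift
print(detect_emg([5.0, -52.3, 12.1]))           # -> True
```

Keeping EMG dispatch ahead of EEG dispatch mirrors the abstract's observation that EMG-based control is almost always immediately responsive, while EEG classification depends on per-user training.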