Exploratory project in which a non-invasive brain-machine interface (BMI) device was used to detect EEG signal patterns and translate them into robot navigation commands. Preliminary EEG pattern training requires the user to concentrate on a specific mental task for a certain amount of time (e.g. 10 sec) while EEG signals are recorded. A Discrete Fourier Transform is computed from the recorded wave signals, and selected frequency samples are used to train a support vector machine that is later used for pattern classification.
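The DFT-plus-SVM pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, frequency band, synthetic two-task data, and the Pegasos-style linear SVM trainer are all assumptions made for the sake of a self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)

def dft_features(window, band=slice(1, 64)):
    """Magnitude spectrum of an EEG window; keep a band of frequency bins."""
    return np.abs(np.fft.rfft(window))[band]

# Synthetic stand-in for two mental tasks: trials dominated by a 10 Hz
# rhythm vs. a 20 Hz rhythm, plus broadband noise (illustrative only).
fs, secs = 128, 2
t = np.arange(fs * secs) / fs

def make_trial(freq):
    return np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal(t.size)

X = np.array([dft_features(make_trial(f)) for f in [10] * 20 + [20] * 20])
X /= np.abs(X).max()                      # scale features for stable training
y = np.array([-1] * 20 + [1] * 20)        # task labels

def train_linear_svm(X, y, lam=0.01, epochs=200):
    """Minimal linear SVM via the Pegasos sub-gradient method."""
    w, step = np.zeros(X.shape[1]), 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            step += 1
            eta = 1.0 / (lam * step)
            w *= (1 - eta * lam)          # regularization shrinkage
            if y[i] * (w @ X[i]) < 1:     # hinge-loss violation
                w += eta * y[i] * X[i]
    return w

w = train_linear_svm(X, y)
accuracy = (np.sign(X @ w) == y).mean()
```

In practice a library SVM (e.g. scikit-learn's `SVC`) with cross-validated parameters would replace the hand-rolled trainer; the point here is only the feature pipeline: raw window, magnitude spectrum, linear classifier.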
We present a software interface that allows a user to control different types of robotic systems using a Brain-Machine Interface (BMI). Unlike common device-specific BMI systems, our software architecture maps simple EEG-based commands to diverse functionalities depending on the robotic platform, so the user does not have to learn to generate new EEG commands for different robots. The graphical user interface provides a mechanism that allows the user to navigate through menus using EMG signals (e.g. eye blinks) and then execute robot commands using EEG signals. Our software is based on a modular design that allows new robotic platforms to be integrated with minimal customization. Our current prototype explores the controllability of a humanoid robot, a flying robot, and a pan-tilt robot using the proposed software interface.
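One way to picture the platform-independent mapping described above is a per-robot adapter that binds the same small set of EEG command labels to platform-specific actions. This is a hedged sketch of the design idea, not the paper's code; the class and command names (`RobotAdapter`, `"CMD_A"`) are hypothetical.

```python
from typing import Callable, Dict

class RobotAdapter:
    """Binds a fixed vocabulary of EEG command labels to robot-specific actions,
    so the same trained EEG pattern drives different behavior per platform."""

    def __init__(self, name: str):
        self.name = name
        self._handlers: Dict[str, Callable[[], str]] = {}

    def on(self, eeg_command: str, action: Callable[[], str]) -> "RobotAdapter":
        self._handlers[eeg_command] = action
        return self

    def execute(self, eeg_command: str) -> str:
        return self._handlers[eeg_command]()

# The same EEG command maps to different functionality on each platform:
humanoid = RobotAdapter("humanoid").on("CMD_A", lambda: "step forward")
drone = RobotAdapter("flying").on("CMD_A", lambda: "increase altitude")
pan_tilt = RobotAdapter("pan-tilt").on("CMD_A", lambda: "pan left")
```

The user trains one set of EEG patterns; adding a new robot means registering a new adapter, not retraining the user.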
Christian I. Penaloza, Yasushi Mae, Kenichi Ohara, and Tatsuo Arai: "Software Interface for Controlling Diverse Robotic Platforms using BMI", IEEE/SICE International Symposium on System Integration, Fukuoka, Japan. December 16-18, 2012.
This research presents a non-invasive Brain-Machine Interface (BMI) that allows persons who suffer from motor paralysis to control appliances in a hospital room using only electromyogram (EMG) signals generated by muscle contractions such as eyebrow movement. The novelty of our system compared to other BMI applications is that it gradually learns user actions and preferences under given room environment conditions (temperature, illumination, etc.) and brain states (e.g. awake, sleepy). By providing the system with learning capabilities, it achieves a certain degree of automation, and patients are relieved from the mental fatigue or stress caused by continuously controlling appliances through a BMI. We present a hierarchical architecture that allows the user to select appliances (window, lights, etc.) and operate them with minimum effort. Our system uses an extended version of the Bayes Point Machine trained with Expectation Propagation to approximate a posterior probability from previously observed user actions, which yields a predictive distribution over new combinations of brain states and environmental conditions.
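The learning loop above, predicting a user's likely appliance action from room conditions and brain state, can be illustrated with a much simpler probabilistic model. The sketch below uses Gaussian naive Bayes as a stand-in for the paper's Bayes Point Machine with Expectation Propagation, and the feature set (temperature, illumination, sleepiness) and action labels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic log of observed (temperature, illumination, sleepiness) -> action.
# Action 1 = "close window", 0 = "leave open"; here driven by temperature only.
n = 200
temp = rng.uniform(15, 30, n)
illum = rng.uniform(0, 1, n)
sleepy = rng.uniform(0, 1, n)
X = np.column_stack([temp, illum, sleepy])
y = (temp < 21).astype(int)

def fit_gnb(X, y):
    """Per-class feature means, variances, and prior from the action log."""
    params = {}
    for c in (0, 1):
        Xc = X[y == c]
        params[c] = (Xc.mean(0), Xc.var(0) + 1e-6, len(Xc) / len(X))
    return params

def predict_proba(params, x):
    """Predictive distribution over actions for a new observed state x."""
    logp = {}
    for c, (mu, var, prior) in params.items():
        logp[c] = np.log(prior) - 0.5 * np.sum(
            np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
    m = max(logp.values())
    z = sum(np.exp(v - m) for v in logp.values())
    return {c: np.exp(v - m) / z for c, v in logp.items()}

params = fit_gnb(X, y)
p = predict_proba(params, np.array([17.0, 0.5, 0.3]))  # a cold room
```

Once the predictive probability for an action is high enough, the system can propose or execute it automatically, sparing the patient a manual BMI command; the paper's BPM/EP machinery plays the same role with a more principled posterior over classifiers.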
Christian I. Penaloza, Yasushi Mae, Kenichi Ohara, and Tatsuo Arai: "BMI-based Learning System for Appliance Control Automation", IEEE International Conference on Robotics and Automation (ICRA 2013), Karlsruhe, Germany. May 6 - 10, 2013.