José M. Azorín is the Director of the Brain-Machine Interface Systems Lab and Associate Professor of the Systems Engineering and Automation Department at Miguel Hernández University of Elche in Spain. He studied Computer Science at the University of Alicante and did his PhD on Tele-robotics at the Miguel Hernández University of Elche. His current research interests are Brain-Computer Interfaces (BCIs), Neurorobotics, Assistive Robotics and Rehabilitation Robotics. We had the chance to talk with him about his work and research.
José, can you tell me a bit more about your research projects, such as limb exoskeletons, BCIs for disabled people and Assistive/Rehabilitation Robotics?
José Azorín: “In recent years, we have been developing different projects related to BCIs and Assistive/Rehabilitation Robotics. In the BioMot project – Smart Wearable Robots with Bioinspired Sensory-Motor Skills, supported by the European Union, we were in charge of developing decoders to detect the intention to start/stop gait from EEG signals, to decode the kinematic parameters of the lower limbs from EEG signals, and to study different cognitive processes related to gait activity in EEG signals, such as the user’s attention or the possibility of detecting obstacles during walking.”
José Azorín: “In the Brain2Motion project – Exoskeletal – neuroprostheses hybrid robotic system for the upper limb controlled by a multimodal brain-neural interface, funded by the Spanish Ministry of Economy and Competitiveness, we implemented different BCIs that allow commanding an upper limb exoskeleton during rehabilitation for people with motor limitations. In this project, we also developed different BCIs based on motor imagery for controlling the movement of robot arms. We made further progress through a project funded by Mapfre Foundation (Spain) to allow people with severe brain damage to communicate basic needs.”
José Azorín: “Currently, we are working on the research project Associate – Decoding and stimulation of motor and sensory brain activity to support long term potentiation through Hebbian and paired associative stimulation during rehabilitation of gait, funded by the Spanish Ministry of Economy and Competitiveness. In this project, we are combining BCIs, neurostimulation techniques based on tDCS and lower limb exoskeletons to improve rehabilitation therapies for people with motor disabilities.”
Why are you motivated to study these topics?
José Azorín: “Before starting to research BCIs, I focused my research on tele-robotics devices, where people commanded a device to control a remote robot. However, I realized that people with motor limitations could not use these devices to control remote robots. Thus, I started to think about how to help people with motor limitations to control remote robots. First, we developed some projects where we detected eye movements of the users by processing their electrooculographic (EOG) signals. Afterwards, we decided to use EEG signals to command external devices, studying how we can use BCIs not only as an assistive technology but also as a rehabilitation technology. Currently, I think we need to continue researching these fields and explore how this technology can be applied to other areas. In my opinion, we don’t know yet the full potential of using BCIs.”
What are the difficulties of using BCIs in everyday lives, especially for people with disabilities?
José Azorín: “From my point of view, we need to develop more portable and plug-and-play BCIs. Most BCIs have been developed in the framework of research projects. Thus, they use a large number of electrodes and gel-based (non-dry) solutions to show the potential of BCIs in scenarios like rehabilitation and assistance. In addition, considerable time is required to prepare the BCI system for each user: the cap must be placed on the user’s head and gel applied, the user has to train, the classifier of the BCI has to be adjusted, and so on. However, people with disabilities need to be able to use these BCIs with only the aid of their relatives, without having to be aware of the technical details. This requires a strong effort in this field to make BCIs more usable in the real world.”
How could limb control via BCI become reality some day? What is necessary to realize it?
José Azorín: “We have shown in several research projects that imagining moving our upper or lower limbs can be used to control exoskeleton robots via BCIs. The combination of BCIs and wearable robots has a huge potential to foster the plasticity of the brain. Thus, it is a promising technology for rehabilitation therapies. Indeed, several centers are analyzing the use of this technology as a part of their therapies. On the other hand, the use of a BCI to command a robotic exoskeleton to assist the movement of people with motor limitations needs more development. In my opinion, if we want people to use this technology in activities of daily living, we need to implement robust BCIs, i.e. BCIs with an accuracy of 100% without any false positives.”
What is necessary to overcome limitations of BCI technology?
José Azorín: “On the one hand, different developments in the hardware are needed. First, it is important to further develop dry electrodes (like the g.Sahara electrodes) that can record EEG activity with signal quality similar to gel electrodes. Second, it is important to reduce the size of the amplifiers and to increase the duration of the batteries to improve their portability. These achievements will foster the use of this technology in a wide range of applications. On the other hand, new approaches in the processing software (feature extraction algorithms and classifiers) are required to improve the robustness of the BCIs.”
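To illustrate the software side he mentions (feature extraction plus classification), here is a minimal motor-imagery-style sketch on synthetic signals: it measures mu-band (8–12 Hz) power, which typically drops during imagined movement (event-related desynchronization), and separates the two classes with a simple threshold. This is a generic teaching example, not the lab’s actual pipeline; the sampling rate, band limits, and signal model are assumptions.

```python
import numpy as np
from numpy.fft import rfft, rfftfreq

def band_power(epoch, fs, lo, hi):
    """Mean spectral power of one EEG epoch in the [lo, hi) Hz band."""
    spec = np.abs(rfft(epoch)) ** 2
    freqs = rfftfreq(len(epoch), 1.0 / fs)
    mask = (freqs >= lo) & (freqs < hi)
    return spec[mask].mean()

rng = np.random.default_rng(0)
fs, n = 256, 512                      # assumed sampling rate / epoch length
t = np.arange(n) / fs

def make_epoch(mu_amp):
    # synthetic "EEG": a 10 Hz mu rhythm of given amplitude plus noise
    return mu_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0.0, 1.0, n)

# rest epochs have strong mu power; imagery epochs show suppressed mu (ERD)
powers_rest = [band_power(make_epoch(2.0), fs, 8, 12) for _ in range(50)]
powers_imag = [band_power(make_epoch(0.3), fs, 8, 12) for _ in range(50)]

# simplest possible "classifier": threshold halfway between class means
thresh = (np.mean(powers_rest) + np.mean(powers_imag)) / 2
acc = (np.mean([p > thresh for p in powers_rest]) +
       np.mean([p < thresh for p in powers_imag])) / 2
```

In practice the threshold would be replaced by a trained classifier (e.g. LDA or SVM) over many channels and bands, which is where the robustness improvements he describes come in.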
You have several g.USBamps with g.LADYbird active electrodes. What BCI applications are you creating?
José Azorín: “We have been using g.tec devices for many years in our research projects. We developed a BCI that allows users to control an upper limb exoskeleton using two different approaches: one based on motor imagery, and the other based on detecting movement intention. This application was tested on stroke patients during their rehabilitation. We implemented a BCI that provides the attention level of the user from EEG signals, even if the user is walking as part of the therapy. Thus, it is possible to know whether or not the person is focused on the rehabilitation therapy.”
José Azorín: “We developed a BCI that allows detecting the appearance of obstacles during walking from EEG signals. We implemented a BCI to detect the start and the stop of the gait through EEG signals. We created a multimodal interface that combines EEG and EOG signals to control a robot arm. Using this interface, it is possible to perform pick and place operations in a tridimensional workspace using a robotic arm.”
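A hybrid interface like the EEG/EOG one he describes can be organized as a small event-driven controller: eye-movement (EOG) events steer the end-effector, while an EEG-derived command triggers pick or place. The sketch below is a hypothetical simplification in plain Python; the event names, step sizes, and 2D workspace are illustrative, not the lab’s implementation.

```python
# Hypothetical sketch of a multimodal EEG/EOG control loop:
# EOG eye-movement events move the arm in the plane, and an EEG
# "select" command toggles pick/place at the current position.

EOG_STEPS = {"left": (-1, 0), "right": (1, 0), "up": (0, 1), "down": (0, -1)}

class ArmController:
    def __init__(self):
        self.pos = [0, 0]      # end-effector position on a grid
        self.holding = False   # whether the gripper holds an object

    def on_eog(self, event):
        # translate a detected eye movement into a motion step
        dx, dy = EOG_STEPS[event]
        self.pos[0] += dx
        self.pos[1] += dy

    def on_eeg_select(self):
        # pick if empty-handed, place if holding
        self.holding = not self.holding

arm = ArmController()
for ev in ["right", "right", "up"]:   # simulated EOG event stream
    arm.on_eog(ev)
arm.on_eeg_select()                   # simulated EEG command: pick here
```

Separating the two channels this way (continuous steering from EOG, discrete selection from EEG) is one common design choice for hybrid interfaces, since each modality handles the kind of command it detects most reliably.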
Figure reproduced from Costa Á, Iáñez E, Úbeda A, Hortal E, Del-Ama AJ, Gil-Agudo Á, et al. (2016): Decoding the Attentional Demands of Gait through EEG Gamma Band Features. PLoS ONE 11(4): e0154136, under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
José Azorín: “We developed a BCI to command a planar robot arm based on motor imagery (see images below). We also implemented a BCI to control a robot arm using 4 mental tasks and a BCI to decode the upper limb kinematics from EEG signals.”
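Decoding limb kinematics from EEG, as mentioned above, is often framed as a regression from EEG features to joint trajectories. Below is a generic ridge-regularized least-squares sketch on synthetic data, where the "EEG features" and the linear generating model are fabricated so the fit can be checked; it is not the decoder from the cited work, and the feature construction and regularization value are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_feats = 400, 16

# Synthetic "EEG features" and a kinematic target produced by a known
# linear map plus noise, so we can verify the decoder recovers it.
X = rng.normal(size=(n_samples, n_feats))
w_true = rng.normal(size=n_feats)
y = X @ w_true + 0.1 * rng.normal(size=n_samples)

# Ridge-regularized least squares: w = (X^T X + lam*I)^-1 X^T y
lam = 1e-2
w_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_feats), X.T @ y)

# Decoding quality: correlation between predicted and actual trajectory
pred = X @ w_hat
r = np.corrcoef(pred, y)[0, 1]
```

Real decoders typically build `X` from band-limited, time-lagged EEG and report exactly this kind of correlation between decoded and measured kinematics.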
Figure reproduced from Úbeda et al. Journal of Neuro Engineering and Rehabilitation (2017) 14:9, DOI 10.1186/s12984-017-0219-0, under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
Why are you using g.tec’s technology?
José Azorín: “Our lab is using this technology for two main reasons. First, the hardware is robust and we get high quality EEG signals. And second, the available SDKs allow us to develop our software easily.”
What is the most exciting part of your daily work? What inspires you?
José Azorín: “I have been always amazed by the possibilities of connecting our brains to external devices by processing our EEG signals. Currently, I believe that there is a huge potential in the use of this technology to improve current rehabilitation and assistive applications. This potential pushes me to continue my research. In addition, the possibilities of using BCI in new fields inspire me to work in this area and to establish new research targets.”
You are organizing a BCI Hackathon in Valencia this year. Could you tell us a bit more about the event?
José Azorín: “We are organizing a BCI hackathon in Valencia (Spain) on September 10-13, 2017, in the framework of the 2017 International Conference on Mobile Brain-Body Imaging (MoBI) and the Neuroscience of Art, Innovation and Creativity. In this hackathon, groups of participants will be organized to develop some artistic BCI projects. Each group will be supervised by a senior participant. This hackathon forms part of the BR41N.IO – The Brain-Computer Interface Designers Hackathon Series: www.BR41N.io. Some examples of projects that will be developed in the hackathon are:
- Design Headsets Using 3D Printing (provided by BR41N.IO): Let’s design a fancy and futuristic EEG headpiece with your own 3D-printed parts. Let it move, light up, blink, hold things or simply look fantastic.
- Dream Painting (provided by BR41N.IO): For the Dream Painting app, the user sleeps with a unicorn headset; when they wake up, they get an image created according to their EEG signals. The participants’ task is to develop an interface based on P300 paradigms using intendiX and a graphic program.
- intendiX Painting (provided by BR41N.IO): Create images by using your thoughts only!
- Sphero SPRK Control (provided by BR41N.IO): Control a Sphero with motor imagery by thinking of left or right hand movement to paint.
- Artistic BCI: Partner with a professional dancer attending the conference and make your brain-based choreography controlling lights and music using your EEG signals.
- Brain and Painting: Is there a relationship between EEG and painting? Can we modify some EEG bands during the painting process? Show it!
- Brain Music: Create music according to your EEG signals and have a jazz session with a performing musician attending the conference.”