Human-Computer Interface for Controlling the Assistive Technology Device

Illustration of a human-computer interface. (Source: https://www.canterbury.ac.nz/)

Imagining a movement without actually performing it is called motor imagery. Motor imagery has been extensively developed for the Brain-Computer Interface (BCI), a technology that uses brain waves to control assistive devices. Assistive technologies such as wheelchairs for people with cerebral palsy, spinal cord injuries, and multiple sclerosis can improve quality of life by making movement easier.

However, converting motor imagery into a BCI input for controlling the movement of assistive devices is not easy. It has typically required a large amount of practice, because not everyone can easily distinguish one movement from another just by imagining it. Sustained concentration is needed, and without helpful visual illustrations users can forget or confuse one imagined movement with another. Difficult as it is, the ability to distinguish between movements is essential for controlling assistive devices, so assistance from other body movements, such as eye movements, blinking, or facial expressions, is needed.

These movements produce action potentials that appear in the EEG as movement-related potentials. Such potentials are usually treated as noise or artifacts to be removed from the EEG signal, but they can instead help the user issue distinguishable movement commands to an assistive device through a computer integrated with the body, known as a human-computer interface.
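As a rough illustration of how such a potential stands out from ongoing EEG, the Python sketch below flags blink-like events in a single frontal channel by simple amplitude thresholding. The channel, threshold, and timing values are assumptions chosen for the example, not parameters from the study.

```python
import numpy as np

def detect_blinks(signal_uv, fs, threshold_uv=75.0, min_gap_s=0.3):
    # Indices where the channel exceeds the amplitude threshold.
    above = np.flatnonzero(np.abs(signal_uv) > threshold_uv)
    if above.size == 0:
        return []
    # Merge crossings closer than min_gap_s into one event and
    # keep only the onset of each event.
    new_event = np.diff(above) > int(min_gap_s * fs)
    onsets = np.concatenate(([above[0]], above[1:][new_event]))
    return (onsets / fs).tolist()  # onset times in seconds

# Example with synthetic data: noise plus a blink-like spike at t = 2 s.
fs = 128  # the EMOTIV EPOC+ samples at 128 Hz
rng = np.random.default_rng(0)
eeg = 10.0 * rng.standard_normal(4 * fs)
eeg[2 * fs:2 * fs + 20] += 150.0 * np.hanning(20)
print(detect_blinks(eeg, fs))  # -> roughly [2.04]
```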

Facial expressions are a form of non-verbal communication and generally reflect a person’s emotions. Building on previous research showing that EEG can detect a person’s emotions from facial expressions, EEG recording involving changes in facial expression has been developed. This study likewise uses facial expressions to control a simple prototype that serves as a wheelchair simulator. Changes in facial expression correlate with changes in Mu and Beta waves: facial expressions or planned movements reduce Mu (8-12 Hz) and Beta (18-26 Hz) activity, so both bands decrease when the facial expression changes. Mu power is lowest when the mouth is opened, while Beta power is lowest when a person becomes emotional, such as when afraid, angry, or happy.
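To make these band definitions concrete, the sketch below estimates Mu and Beta band power with Welch’s method, a simpler alternative to the wavelet approach the study actually uses. The 128 Hz sampling rate and the segment names are assumptions for the example.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band):
    # Average power spectral density inside the band, via Welch's method.
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

MU, BETA = (8.0, 12.0), (18.0, 26.0)  # bands as defined above

# With `rest` and `expression` as hypothetical 1-D EEG segments from the
# same channel, the desynchronization described above would show up as:
#   band_power(expression, 128, MU) < band_power(rest, 128, MU)
#   band_power(expression, 128, BETA) < band_power(rest, 128, BETA)
```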

This study tries to distinguish three commands, namely forward, backward, and stop, from the EEG signal recorded while the participant raises the eyebrows, frowns, or blinks for each command. EEG is recorded with an EMOTIV EPOC+ for approximately 10 seconds, and the signal is then processed with the wavelet transform to obtain the power of the Mu and Beta waves, which serves as input for an Extreme Learning Machine that classifies the forward, backward, or stop command.
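The sketch below outlines this pipeline under stated assumptions: a discrete wavelet decomposition whose detail levels approximately cover the Mu and Beta bands at 128 Hz, and a minimal Extreme Learning Machine. The wavelet family (db4), decomposition level, hidden-layer size, and data layout are illustrative choices, not the study’s exact configuration.

```python
import numpy as np
import pywt

FS = 128  # EMOTIV EPOC+ sampling rate

def wavelet_band_powers(epoch):
    # Discrete wavelet decomposition; at 128 Hz the detail levels
    # roughly cover d1: 32-64 Hz, d2: 16-32 Hz, d3: 8-16 Hz, so
    # d3 approximates the Mu band and d2 the Beta band.
    cA3, cD3, cD2, cD1 = pywt.wavedec(epoch, 'db4', level=3)
    return np.array([np.mean(cD3 ** 2),   # ~Mu power
                     np.mean(cD2 ** 2)])  # ~Beta power

class ELM:
    """Single-hidden-layer Extreme Learning Machine: random input
    weights, sigmoid activation, output weights solved in closed
    form with the Moore-Penrose pseudoinverse."""
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        T = np.eye(y.max() + 1)[y]  # one-hot targets
        self.beta = np.linalg.pinv(self._hidden(X)) @ T
        return self

    def predict(self, X):
        return (self._hidden(X) @ self.beta).argmax(axis=1)

# Hypothetical usage: `epochs` is (n_trials, 10 * FS) single-channel EEG,
# `labels` holds 0/1/2 for forward/backward/stop.
# X = np.vstack([wavelet_band_powers(e) for e in epochs])
# commands = ELM().fit(X, labels).predict(X)
```

The closed-form pseudoinverse solution is what makes the ELM fast to train compared with iteratively trained networks, which suits a pipeline that must be recalibrated per user.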

The results show that facial expressions can help differentiate movement commands for assistive devices. This makes control easier for users who have difficulty concentrating on motor imagery, and it spares users the fatigue of constant concentration while moving the device.

The results of this study were presented at the 2nd International Conference on Physical Instrumentation and Advanced Materials (ICPIAM) 2019 in Surabaya, Indonesia, on October 22, 2019. The full article is available at the following link:

https://aip.scitation.org/doi/abs/10.1063/5.0034256

O. N. Rahma, M. N. Kurniawati, A. Rahmatillah, and K. Ain, “Human-Computer-Interface for controlling the Assistive Technology Device,” AIP Conf. Proc., vol. 2314, no. 1, Dec. 2020, doi: 10.1063/5.0034256.

Author: Osmalina Nur Rahma
