A new wearable brain-machine interface (BMI) system could improve quality of life for people with motor dysfunction or paralysis, even those with locked-in syndrome — a condition in which a person is fully conscious but unable to move or communicate.
An international, multi-institutional team of researchers led by the Woon-Hong Yeo laboratory at the Georgia Institute of Technology has combined wireless soft scalp electronics and virtual reality in a BMI system that allows the user to imagine an action and wirelessly control a wheelchair or robotic arm.
The team, which included researchers from the University of Kent (United Kingdom) and Yonsei University (Republic of Korea), describes the new motor imagery-based BMI system this month in the journal Advanced Science.
“The major advantage of this system to the user, compared to what currently exists, is that it is soft and comfortable to wear, and doesn’t have any wires,” said Yeo, a professor in the George W. Woodruff School of Mechanical Engineering.
BMI systems are a rehabilitation technology that analyzes a person’s brain signals, translating that neural activity into commands and turning intentions into actions. The most common non-invasive method for acquiring those signals is electroencephalography (EEG), which typically requires a cumbersome electrode-studded skull cap and a tangle of wires.
These devices generally rely heavily on gels and pastes to maintain skin contact, require long setup times, and are inconvenient and uncomfortable to use. The devices also often suffer from poor signal acquisition due to material degradation or motion artifacts — extra “noise” that can be caused by something as simple as grinding teeth or blinking eyes. This noise shows up in the brain data and must be filtered out.
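To illustrate the kind of cleanup such systems must perform (a minimal sketch, not the team's actual pipeline), a crude band-pass filter can suppress both slow signal drift and high-frequency interference in a raw EEG trace, keeping the frequency band typically used for motor imagery:

```python
import numpy as np

def bandpass_fft(signal, fs, low=8.0, high=30.0):
    """Zero out frequency components outside [low, high] Hz.

    A crude FFT-based band-pass; real EEG pipelines use proper
    IIR/FIR filters, but the idea is the same: keep the mu/beta
    band relevant to motor imagery, drop drift and line noise.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return np.fft.irfft(spectrum * mask, n=len(signal))

# Synthetic 2-second trace sampled at 250 Hz: a 10 Hz "brain
# rhythm" plus 1 Hz baseline drift and 60 Hz line noise.
fs = 250
t = np.arange(2 * fs) / fs
raw = (np.sin(2 * np.pi * 10 * t)
       + 2.0 * np.sin(2 * np.pi * 1 * t)
       + 0.5 * np.sin(2 * np.pi * 60 * t))
clean = bandpass_fft(raw, fs)
```

After filtering, the 10 Hz component survives while the drift and line-noise components are removed — which is exactly why clean, artifact-resistant electrodes matter: the less noise enters the recording, the less aggressively it has to be filtered away.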
The wearable EEG system Yeo designed, which pairs imperceptible microneedle electrodes with soft wireless circuits, offers improved signal acquisition. Accurately measuring those brain signals is critical to determining what action a user wants to perform, so the team paired a powerful machine learning algorithm with a virtual reality component to address that challenge.
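The classification step can be sketched in miniature. The example below is a hypothetical illustration, not the team's algorithm: it decodes two imagined movements from synthetic two-dimensional "band-power" features using a nearest-centroid rule, the simplest stand-in for the machine learning a real BMI would use.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical band-power features for two imagined movements
# (e.g. left- vs. right-hand imagery). A real system would extract
# these from multi-channel EEG; here they are synthetic clusters.
left = rng.normal(loc=[1.0, 3.0], scale=0.3, size=(40, 2))
right = rng.normal(loc=[3.0, 1.0], scale=0.3, size=(40, 2))

# "Training" is just averaging each class into a centroid.
centroids = {"left": left.mean(axis=0), "right": right.mean(axis=0)}

def decode(feature_vec):
    """Map a feature vector to a command via the nearest centroid."""
    return min(centroids,
               key=lambda k: np.linalg.norm(feature_vec - centroids[k]))

command = decode(np.array([0.9, 3.1]))  # lands near the "left" cluster
```

The design point carries over to the real system: the cleaner and more consistent the recorded signals, the more separable these feature clusters are, and the more reliably imagined actions map to commands.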
The new system has been tested with four human subjects, but has not yet been studied with disabled individuals.
“This is just a preliminary demonstration, but we are pleased with what we saw,” said Yeo, director of Georgia Tech’s Center for Human-Centric Interfaces and Engineering within the Institute for Electronics and Nanotechnology, and a member of the Petit Institute for Bioengineering and Bioscience.
Yeo’s team originally introduced the soft, wearable EEG brain-machine interface in a 2019 study published in Nature Machine Intelligence. Musa Mahmood, the lead author of that work, was also lead author of the team’s new paper.
“This new brain-machine interface uses an entirely different paradigm, involving imagined motor actions, such as grasping with either hand, which frees the subject from having to look at too many stimuli,” said Mahmood, a Ph.D. student in Yeo’s laboratory.
In the 2021 study, users demonstrated accurate control of virtual reality exercises using their thoughts — their motor imagery. The visual cues improve the process for both the user and the researchers gathering the information.
“The virtual prompts have proven to be very helpful,” Yeo said. “They speed up and improve user engagement and accuracy. And we were able to record continuous, high-quality motor imagery activity.”
According to Mahmood, future work on the system will focus on optimizing electrode placement and more advanced integration of stimulus-based EEG, using what they learned from the last two studies.
The wearable brain-machine interface can control a wheelchair, vehicle, or computer.
Musa Mahmood et al., Wireless soft scalp electronics and virtual reality system for motor imagery-based brain-machine interfaces, Advanced Science (2021). DOI: 10.1002/advs.202101129
Musa Mahmood et al., Fully portable and wireless universal brain-machine interfaces enabled by flexible scalp electronics and deep learning algorithm, Nature Machine Intelligence (2019). DOI: 10.1038/s42256-019-0091-7
Provided by Georgia Institute of Technology
Citation: Wearable brain-machine interface turns intentions into actions (2021, July 21). Retrieved July 22, 2021 from https://techxplore.com/news/2021-07-wearable-brain-machine-interface-intentions-actions.html
This document is subject to copyright. Notwithstanding any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for informational purposes only.