
Verified by Psychology Today


AI Deep Learning Improves Brain-Computer Interface Performance

A deep neural network outperforms traditional decoders for brain-computer interfaces.

Source: M0DY/Pixabay

A new study published in PNAS Nexus by researchers at Carnegie Mellon University (CMU) demonstrates how a noninvasive brain-computer interface (BCI) powered by artificial intelligence (AI) deep learning (DL) can enable humans to continuously control a cursor using thoughts.

“This study demonstrates the potential of using DL-based decoders for online BCI decoding in challenging tasks and shows that subjects can achieve strong performance with these models,” wrote corresponding author Dr. Bin He, Trustee Professor of Biomedical Engineering, Professor of the Neuroscience Institute, and Professor by courtesy of Electrical and Computer Engineering at Carnegie Mellon University, along with co-authors Dylan Forenzo, Hao Zhu, Jenn Shanahan, and Jaehyun Lim.

Brain-computer interfaces enable users to control external devices using the mind. BCIs are assistive neurotechnology that may help those with impaired speech or who are paralyzed to control external devices such as robotic limbs, operate a motorized wheelchair, and control a computer cursor. The brain-computer industry is projected to reach $6.2 billion by 2030, according to Grand View Research.

The field of brain-computer interfaces has gained mainstream media attention in recent years, largely because South African-born American entrepreneur Elon Musk co-founded Neuralink in 2016 with Max Hodak and others. In January 2024, Musk announced on his social media site X (formerly Twitter) that Neuralink had implanted its BCI device in a human for the first time. But this was not the first time a human had been implanted with a brain-computer interface.

Decades earlier, the first people to receive brain-computer interfaces were an ALS patient and a brainstem stroke survivor. In 1998, Emory University neurosurgeon Dr. Roy E. Bakay and neuroscientist Dr. Philip R. Kennedy implanted neurotrophic electrodes in the motor cortex of these patients, enabling them to control a cursor on an external computer with their thoughts and thereby prompt computerized speech.

BCIs use artificial intelligence to decode complex brain activity and predict the user’s intended action. Brain activity may be collected invasively (requiring surgery to implant electrodes) or noninvasively, with methods such as electroencephalography (EEG). An EEG records the brain’s electrical activity via electrodes placed on the scalp; the EEG machine amplifies these signals and produces a graphical output.
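To make the decoding idea concrete, here is a minimal NumPy sketch of one classic first step: extracting the power of a frequency band (here the 8–13 Hz alpha band) from multichannel EEG. The channel count, sampling rate, and band edges are illustrative assumptions, not the study’s actual pipeline.

```python
import numpy as np

FS = 250          # sampling rate in Hz (assumed for illustration)
N_CHANNELS = 4    # far fewer channels than the 64-electrode cap in the study

def band_power(eeg, fs, low, high):
    """Mean spectral power per channel within [low, high] Hz."""
    freqs = np.fft.rfftfreq(eeg.shape[1], d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(eeg, axis=1)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[:, mask].mean(axis=1)

# Simulate 2 seconds of noisy EEG with a strong 10 Hz rhythm on channel 0.
rng = np.random.default_rng(0)
t = np.arange(2 * FS) / FS
eeg = rng.normal(0.0, 1.0, (N_CHANNELS, t.size))
eeg[0] += 5.0 * np.sin(2 * np.pi * 10 * t)

power = band_power(eeg, FS, 8, 13)
print(power.argmax())  # channel 0 carries the strongest alpha power
```

A real BCI would compute features like these continuously and feed them to a decoder that maps them onto an intended action, such as a cursor movement.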

In the 1920s, German neuropsychiatrist Hans Berger (1873-1941) conducted the first EEG recordings of the human brain and published a series of scientific papers. Berger coined the term “electroencephalogram.”

Fast forward to the present day, and neurologists use EEG testing as a safe method to help diagnose epilepsy, brain injuries, sleep disorders, psychosis, stroke, tumors, dementia, Alzheimer’s disease, Creutzfeldt-Jakob disease, and other conditions. In 2023, the global market size for electroencephalography devices was $1.21 billion, a figure that is expected to nearly double to $2.39 billion by 2030.

“Among noninvasive methods, EEG is particularly well suited for BCI systems due to its high temporal resolution, portability, and relatively low cost,” the CMU researchers wrote.

To conduct this research, the brain activity of healthy human participants was recorded noninvasively using Compumedics Neuroscan’s 64-channel EEG Quik-Cap with the SynAmps 2/RT system to amplify the signals. The participants mentally tracked a continuously moving object on a screen while the EEG cap placed on their heads recorded their neural signals. A deep neural network then decoded the neural recordings and predicted each participant’s intentions.
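The paper’s actual decoder architectures are not reproduced here; as a hedged sketch, the following NumPy example shows the basic shape of such a decoder: a small feedforward network that maps a window of multichannel EEG samples to a two-dimensional cursor velocity. The layer sizes and the randomly initialized weights are assumptions standing in for a trained model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative dimensions (assumed, not from the paper):
N_CHANNELS, WINDOW, HIDDEN = 64, 100, 32   # 64 electrodes, 100-sample window

# Random weights stand in for a network trained on recorded EEG sessions.
W1 = rng.normal(0.0, 0.01, (HIDDEN, N_CHANNELS * WINDOW))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.01, (2, HIDDEN))    # output: (vx, vy) cursor velocity
b2 = np.zeros(2)

def decode(eeg_window):
    """Map one EEG window (channels x samples) to a 2-D cursor velocity."""
    x = eeg_window.reshape(-1)              # flatten to a feature vector
    h = np.maximum(0.0, W1 @ x + b1)        # ReLU hidden layer
    return W2 @ h + b2

velocity = decode(rng.normal(0.0, 1.0, (N_CHANNELS, WINDOW)))
print(velocity.shape)  # (2,)
```

In an online BCI, a network like this would be applied to each new EEG window in real time, and its output would continuously drive the cursor on the screen.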

The scientists compared their newly developed AI deep learning algorithm to existing deep learning decoders with encouraging results.

“We rigorously evaluated the DL-based decoders in a total of 28 human participants, and found that the DL-based models improved throughout the sessions as more training data became available and significantly outperformed a traditional BCI decoder by the last session,” the scientists reported.

The researchers demonstrated how AI deep learning, combined with a noninvasive brain-computer interface, may pave the way for high-performance devices to not only assist paralyzed patients but also improve the quality of daily living for able-bodied people in the future.

Copyright © 2024 Cami Rosso All rights reserved.
