
Autos to Integrate AI-based Brain-Computer Interfaces (BCIs)

BCI to enable drivers to use thoughts to control select car functions.

Source: Geralt/Pixabay

Brain-computer interfaces (BCIs), also known as brain-machine interfaces (BMIs), convert brain activity into signals that control external devices, typically for people who have lost or have impaired movement or speech. Now companies are actively pursuing BCIs for everyone, not just those with impairments. Earlier this week at the IAA Mobility summit, car maker Mercedes-Benz AG announced plans to integrate a brain-computer interface into its future VISION AVTR concept vehicle.

“Selecting the navigation destination by thought control, switching the ambient light in the interior or changing the radio station: Brain-computer interfaces (BCI) enable this new form of machine control,” wrote Mercedes-Benz AG in a statement.

Source: Mercedes-Benz AG

The brain-computer interface technology that Mercedes-Benz AG plans to use comes from a partnership with NextMind, a neurotechnology startup whose investors include David Helgason, Bpifrance, Sune Alstrup (Johansen), Nordic Makers, and Sisu Game Ventures, according to Crunchbase. NextMind was founded as a spin-out from the Paris lab of cognitive neuroscientist Sid Kouider, a professor at École Normale Supérieure.

NextMind’s BCI converts brain activity into digital commands that enable the user to control visual interfaces in real time. It does so by tagging visual objects with NeuroTags, faint graphic overlays that are optimized for the human brain’s visual cortex.
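NextMind has not published how NeuroTags are rendered, but a common pattern in visual BCIs is to give each selectable object its own visual signature, often a distinct flicker frequency. A minimal sketch of that idea, with hypothetical cockpit targets and made-up frequencies:

```python
# Illustrative sketch only: NextMind's NeuroTag rendering is proprietary.
# A common visual-BCI approach is to overlay each selectable object with a
# faint pattern flickering at its own frequency, so the visual cortex "echoes"
# the tag of whatever the user focuses on. Names and frequencies are hypothetical.
from dataclasses import dataclass

@dataclass
class NeuroTagLike:
    car_function: str   # hypothetical in-car function the tag selects
    flicker_hz: float   # unique flicker frequency of the faint overlay

COCKPIT_TAGS = [
    NeuroTagLike("navigation_destination", 8.0),
    NeuroTagLike("ambient_light", 10.0),
    NeuroTagLike("radio_station", 12.0),
]
```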

In neuroanatomy, the visual cortex is the region of the cerebral cortex, located in the occipital lobe, where visual information from the retinas is received, integrated, and processed. Each hemisphere of the human brain has its own visual cortex, and each processes the opposite half of the visual field: the left visual cortex processes the right visual field, and vice versa. The visual cortex is commonly divided into five areas (V1-V5) that differ in structure and function.

NextMind applies non-invasive EEG (electroencephalography) technology to read electrical signals from the brain. Machine learning algorithms, a form of artificial intelligence (AI), decode these neural signals to identify the user’s visual focus. Once the user’s visual focus is identified, the selected object reacts visually and executes the associated functional command.
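To illustrate the decoding step, here is a minimal sketch of how a frequency-tagged visual BCI can infer which target the user is focusing on. The sampling rate, tag frequencies, and simple correlation scoring below are assumptions for illustration, not NextMind’s proprietary algorithm.

```python
# Illustrative sketch only: NextMind's decoding model is proprietary. In
# frequency-tagged visual BCIs, EEG over the visual cortex echoes the flicker
# frequency of whichever tagged object the user is focusing on.
import numpy as np

FS = 250.0                      # EEG sampling rate in Hz (assumed)
TAG_FREQS = {                   # hypothetical tag frequencies per car function
    "navigation_destination": 8.0,
    "ambient_light": 10.0,
    "radio_station": 12.0,
}

def decode_focus(eeg_window: np.ndarray) -> str:
    """Pick the tag whose flicker frequency best matches the EEG window.

    eeg_window: 1-D array of occipital EEG samples (one channel, ~1 s).
    Uses a simple projection onto sine/cosine references; production systems
    use many electrodes and trained machine-learning decoders instead.
    """
    t = np.arange(len(eeg_window)) / FS
    scores = {}
    for command, freq in TAG_FREQS.items():
        refs = np.stack([np.sin(2 * np.pi * freq * t),
                         np.cos(2 * np.pi * freq * t)])
        # Score = energy of the EEG projected onto the reference pair
        scores[command] = np.sum((refs @ eeg_window) ** 2)
    return max(scores, key=scores.get)

# Example: a noisy simulated 10 Hz signal should decode to "ambient_light"
rng = np.random.default_rng(0)
t = np.arange(int(FS)) / FS
fake_eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal(t.size)
print(decode_focus(fake_eeg))   # -> "ambient_light"
```

In this toy example, the decoded label would then be mapped to the matching in-car action, such as setting the ambient light, which corresponds to the final step described above where the selected object reacts and executes its command.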

Source: Mercedes-Benz AG

“Mercedes-Benz is setting another milestone in the merging of man and machine with the research and development of brain-computer interface applications in cars,” said Britta Seeger, Member of the Board of Management of Daimler AG and Mercedes-Benz AG, responsible for Sales, in a statement. “BCI technology has the potential to further enhance driving comfort in the future, for example. Mercedes-Benz has always pioneered intelligent, innovative solutions to provide our customers with the best product and service experience. BCI technology works completely independently of speech and touch. This opens up revolutionary possibilities for intuitive interaction with the vehicle.”

Copyright © 2021 Cami Rosso All rights reserved.
