

Neural Implants and the Future of Language

Brain implants are predicted to revolutionize language learning.

Key points

  • Elon Musk has claimed that brain implants could replace all human languages with a single, universal language.
  • Language learning is not just a technological problem; it also involves embodied experience, which gives words their meanings.
  • Offloading language learning to brain implants may irrevocably alter what it means to be human.

Imagine a near future in which people no longer learn languages. Instead, we stream them on demand, much as many of us today stream music or movies to our computers. But in this future, language would be streamed from the internet directly to a neural implant in our heads. This would entail brain-computer interface technology: a language chip sewn into the cortex of the brain, as well as a modem chip, perhaps located behind the ear, capable of receiving language-streaming signals.

This is exactly the near future imagined in my forthcoming science-fiction book The Babel Apocalypse. But in fact, such a near-future possibility is not limited to the realm of science fiction. It is very much grounded in current technology and in ongoing research connecting human brains with computers.

But such technology raises very real ethical questions, ranging from the potential use and abuse of brain chips by big government or big tech, as envisaged in The End of Sleep, to the possibility of individual brains being hacked. Ultimately, brain implants that affect an ingredient so vital to what it means to be human, namely language, would have far-reaching consequences for the human experience, including how we think, feel, learn, and interact with each other and the world around us.

The psychological benefits of learning language(s)

In the English-speaking world, where the global reach of English is taken for granted, it is widely assumed that everyone speaks just one language: their mother tongue. But outside the Anglosphere the reality is often markedly different: speaking two or more languages—bilingualism or even multilingualism—is often the norm.

Indeed, well over half the world’s population speaks at least two languages, often more. In Johannesburg, South Africa, for instance, it is common for people to speak five languages.

From a psychological perspective, it is well established that speaking and learning another language brings considerable benefits to the capacity of our brains, over and above the obvious practical benefit of being able to communicate more widely.

Speaking two or more languages brings cognitive advantages, ranging from being better able to grasp new skills to an improved ability to focus and concentrate. It also improves memory. And in your twilight years, it provides better protection against dementia: bilinguals are less likely to suffer from cognitive impairment in later life than their monolingual peers. In his TED talk on 4 reasons to learn a new language, linguistics professor John McWhorter compellingly presents some of these benefits.

Could language learning be replaced by technology?

The short answer, at least according to Elon Musk, is yes. In 2016, Musk, along with others, founded Neuralink, a neurotechnology company working to develop implantable brain-computer interface chips. In the first instance, Neuralink is working on technology that would allow paraplegics to communicate with a computer by making mouse clicks, or to operate a smartphone, without the need for actual motor movement. A useful podcast primer on Neuralink’s technology can be found here. While human trials have not yet begun, the technology involves implanting computer chips in the brain, enabling, at least in principle, wireless communication with external computers.

While brain implants are not new—research on human neural implants dates back to the 1970s—today neural implants are increasingly used to circumvent brain areas that have become dysfunctional. Such implants function as biomedical prostheses in cases of stroke or head injury. Other uses include implants that provide deep brain stimulation in cases of Parkinson’s disease, or even to treat depression.

But what is wholly new about Neuralink is Elon Musk’s vision for brain implants to go beyond medical remedy and aim at enhancing the human experience. A case in point is language. Musk has recently claimed that brain implants could soon spell the end of learning language the old-fashioned way. He has gone so far as to claim that, perhaps in as little as 5 to 10 years, human languages could be replaced by a single universal language delivered via brain implants, making communication more effective and convenient.

Musk’s claims and predictions have a track record of not materializing, at least not within the time frames envisaged. But suppose this prediction came true: what would be the psychological implications? That is, if, in the future, the technology exists to offload the language-learning problem, as envisaged in The Babel Apocalypse, and to stream language directly into our heads, what does this mean for how we communicate? And what does it mean for how we think, feel, and experience the world around us?

The nature of meaning

The essential function of language is to facilitate communication. Language does this, remarkably well, by encoding and externalizing concepts—thoughts and ideas—turning amorphous thought, locked inside our heads, into something we can verbalize or signal (using written or typed text, signed gestures, and so on), in order to get “our ideas across” to someone in the real world. Psycholinguists refer to this primary function of language as signaling a communicative intention. More prosaically, we can refer to this process as conveying meaning.

But research in the psychological sciences has conclusively shown, over the last couple of decades, that the meanings conveyed by language are grounded in our world of lived, everyday experience. For instance, when I say, “Hammer that nail into the wood,” you understand those words, in part, by activating the motor areas of your brain that know what it feels like to hold a hammer and strike it against a nail.

In other words, understanding language involves activating the embodied experience of the very situation the words are meant to convey. The meaning associated with words is not some abstract thing; it depends on what is technically known as an embodied simulation of the very action or experience the words convey.

Psychologically speaking, given that the meanings conveyed by language depend upon actual experiences, the very brain states that encode lived experience, the question remains: how would our brave new world of brain implants facilitate language if words are not tied to those same brain states in the living brains that form them? If language is streamed to us via a computer, perhaps from the internet in space, where do the words derive their meanings from?

The Symbol Grounding Problem of meaning

In the psychological sciences, this is known as the Symbol Grounding Problem of meaning. Words are not abstract entities. They can be used to do things in the world—make us fall in love, buy a new product, make someone laugh—because of the meanings attached to them. But if we offload language to a computer, the problem is how to hook it up to the meanings that remain resolutely inside our heads.

Psychologically speaking, brain implants that simplify language use and language learning would inevitably change the nature of language and communication. They would also change the nature of meaning and, in the process, what it means to be human. Even if such technology were possible, would it, therefore, even be desirable?
