
Artificial intelligence translates thoughts into text using brain implant

Mind-reading AI can turn neural activity into sentences with a 97 per cent accuracy rate

Anthony Cuthbertson
Tuesday 31 March 2020 15:10 BST
Facebook and Elon Musk's Neuralink are among the companies working on telepathic technologies

Scientists have developed an artificial intelligence system that can translate a person’s thoughts into text by analysing their brain activity.

Researchers at the University of California, San Francisco, developed the AI to decipher, in real time, a vocabulary of up to 250 words used across sets of between 30 and 50 sentences.

The algorithm was trained using the neural signals of four women with electrodes already implanted in their brains to monitor epileptic seizures.

The volunteers repeatedly read sentences aloud while the researchers fed the recorded brain activity to the AI, which learned to pick out patterns associated with individual words. The average word error rate when decoding a repeated set of sentences was as low as 3 per cent.
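
The 3 per cent figure refers to word error rate, the standard metric for transcription systems: the number of word-level insertions, deletions and substitutions needed to turn the decoded sentence into the true one, divided by the length of the true sentence. Below is a minimal Python sketch of that calculation; the example sentences are invented for illustration, not taken from the study.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level edit distance divided by the length of the reference sentence."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Dynamic-programming table of edit distances between word prefixes.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

# One wrong word in a six-word sentence gives roughly a 17 per cent error rate.
print(word_error_rate("the boat sailed across the lake",
                      "the goat sailed across the lake"))
```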

“A decade after speech was first decoded from human brain signals, accuracy and speed remain far below that of natural speech,” states a paper detailing the research, published this week in the journal Nature Neuroscience.

“Taking a cue from recent advances in machine translation, we trained a recurrent neural network to encode each sentence-length sequence of neural activity into an abstract representation, and then to decode this representation, word by word, into an English sentence.”
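
In other words, one network compresses a sentence's worth of neural recordings into a single abstract representation, and a second network reads words back out of that representation one at a time. The sketch below shows the general shape of such an encoder-decoder in PyTorch; the layer types, channel counts and vocabulary size are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class NeuralToTextDecoder(nn.Module):
    """Illustrative encoder-decoder: an RNN encodes a sentence-length window of
    neural features into a single vector, and a second RNN decodes that vector
    into words one at a time. All sizes are hypothetical."""

    def __init__(self, n_channels=128, hidden=256, vocab_size=250):
        super().__init__()
        self.encoder = nn.GRU(n_channels, hidden, batch_first=True)
        self.embed = nn.Embedding(vocab_size, hidden)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.to_word = nn.Linear(hidden, vocab_size)

    def forward(self, neural_signals, previous_words):
        # neural_signals: (batch, time_steps, n_channels) of recorded activity
        # previous_words: (batch, sentence_length) of word indices
        _, sentence_repr = self.encoder(neural_signals)   # abstract representation
        word_vecs = self.embed(previous_words)
        out, _ = self.decoder(word_vecs, sentence_repr)   # condition on the representation
        return self.to_word(out)                          # logits over the word vocabulary

model = NeuralToTextDecoder()
signals = torch.randn(1, 400, 128)           # one fake sentence-length recording
prev = torch.zeros(1, 12, dtype=torch.long)  # word indices for a 12-word sentence
print(model(signals, prev).shape)            # torch.Size([1, 12, 250])
```

In a setup like this, the decoder would be fed the previous true word at each step during training (teacher forcing) and its own predictions at test time.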

The average active vocabulary of an English speaker is estimated to be around 20,000 words, meaning the system is a long way off being able to understand regular speech.

Researchers are unsure how well the system will scale up, as the decoder relies on learning the structure of a sentence to improve its predictions. Every word added to the vocabulary increases the number of possible sentences, which reduces the overall accuracy, as the rough calculation below illustrates.
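
A back-of-the-envelope figure (illustrative only, not from the paper) shows how quickly the space of candidate sentences grows with vocabulary size:

```python
# How the space of possible word sequences grows with vocabulary size.
# The eight-word sentence length is an arbitrary choice for illustration.
for vocab in (50, 250, 20_000):
    print(f"{vocab:>6}-word vocabulary -> about {vocab ** 8:.1e} possible 8-word sequences")
```

In practice a decoder prunes most of these sequences using the statistics of the language, which is the kind of regularity the researchers refer to below.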

“Although we should like the decoder to learn and to exploit the regularities of the language, it remains to show how much data would be required to expand from our tiny languages to a more general form of English,” the paper states.

One possibility could be to combine it with other brain-computer interface technologies that use different types of implants and algorithms.

Last year, a report by the Royal Society claimed that neural interfaces linking human brains to computers will enable mind reading between people.

The report cited technologies currently being developed by Elon Musk’s Neuralink startup and Facebook, which describes cyborg telepathy as “the next great wave in human-oriented computing”.

Neuralink says learning to use its device is ‘like learning to touch type or play the piano’

The Royal Society estimated that such interfaces will be an “established option” for treating diseases like Alzheimer’s within two decades.

“People could become telepathic to some degree, able to converse not only without speaking but without words,” the report stated, while expanding on more futuristic applications like being able to virtually taste and smell without physically experiencing the sensation.

“Someone on holiday could beam a ‘neural postcard’ of what they are seeing, hearing or tasting into the mind of a friend back home.”
