Forget typing: Computers that can read your MIND and convert your thoughts into text are on their way

  • Computers could soon be able to decode your thoughts into actual speech
  • This is according to a review article, which looked at different techniques
  • Methods include fMRI and near infrared imaging that detect neural signals
  • But one method, called electrocorticography, seems the most promising

Voice-recognition technology can be useful for dictating text when you have no way of typing.

But computers could soon be able to decode your thoughts into actual speech or written words, without you even saying a word.

This kind of technology sounds like science fiction, but there are a variety of ways scientists are edging towards making it a reality, according to a new review.


COMPUTER CAN TELL WHO YOU ARE THINKING ABOUT 

Reading minds is an ability only found in comic book heroes.

But new research has revealed that computers can now analyse brain scans and work out who a person is thinking about. 

The AI system can even create a digital portrait of the face in question.

Researchers at the Kuhl Lab at the University of Oregon used an innovative form of fMRI pattern analysis to test whether lateral parietal cortex actively represents the contents of memory. 

Using a large set of human face images, they first extracted latent face components, known as eigenfaces.

Then machine learning algorithms were used to predict these face components from fMRI activity patterns and reconstruct images of individual faces as digital portraits.
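The sketch below illustrates that two-step idea on synthetic data. The array sizes, the use of PCA to derive eigenfaces and ridge regression as the decoder are assumptions made for illustration, not details taken from the Oregon study.

```python
# Minimal sketch of the eigenface-reconstruction idea described above.
# All data here is synthetic; the shapes, the use of PCA for eigenfaces and
# ridge regression as the "machine learning algorithm" are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Pretend we have 300 face images flattened to 64x64 pixels, and an fMRI
# activity pattern (1,000 voxels) recorded while viewing or recalling each face.
faces = rng.normal(size=(300, 64 * 64))
fmri = rng.normal(size=(300, 1000))

# Step 1: extract latent face components ("eigenfaces") from the images.
pca = PCA(n_components=50)
face_components = pca.fit_transform(faces)        # shape (300, 50)

# Step 2: learn to predict those components from fMRI activity patterns.
decoder = Ridge(alpha=1.0).fit(fmri[:250], face_components[:250])

# Step 3: for held-out trials, predict components and invert the PCA
# to reconstruct a digital portrait of the remembered face.
predicted_components = decoder.predict(fmri[250:])
reconstructed_faces = pca.inverse_transform(predicted_components)
print(reconstructed_faces.shape)                  # (50, 4096) pixel images
```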


Computers that can read our minds might enhance the speech interfaces we already use with devices, such as Siri and 'Ok Google'.

But it could be even more important for people with speech difficulties, and above all for patients who lack any speech or motor function at all.

'So instead of saying "Siri, what is the weather like today" or "Ok Google, where can I go for lunch?" I just imagine saying these things,' said Christian Herff, author of a review recently published in the journal Frontiers in Human Neuroscience.

Reading someone's thoughts might still belong to the realms of science fiction, but scientists are already decoding speech from signals generated in our brains when we speak or listen to someone talking.

In the new study, Mr Herff and co-author Dr Tanja Schultz, both from the Karlsruhe Institute of Technology, compared the pros and cons of using various brain imaging techniques to take neural signals from the brain and decode them to text.

There are a variety of technologies out there, the authors said, including functional MRI and near infrared imaging that detect neural signals based on the metabolic activity of neurons.

Another method can detect electromagnetic activity of neurons responding to speech. 

But there was one method in particular, called electrocorticography, which stood out in the new review, the authors said.

This technique was used in a brain-to-text system, demonstrated on epilepsy patients who already had electrode grids implanted for treatment of their condition.

The patients read out texts presented on a screen in front of them while their brain activity was recorded.

This formed the basis of a database of patterns of neural signals that could then be matched to speech elements, or 'phones'.

When the researchers included language and dictionary models in their algorithms, they were able to decode neural signals to text with a high degree of accuracy.
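A rough sketch of that pipeline on made-up data is shown below. The features, the naive-Bayes phone classifier and the two-word dictionary are illustrative assumptions, not the system described in the review.

```python
# Minimal sketch of the brain-to-text idea: match neural-signal frames to
# speech sounds ("phones"), then use a dictionary model to pick the word.
# The "ECoG" features, the Gaussian naive-Bayes phone model and the tiny
# dictionary are all illustrative assumptions, not the published system.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
phones = ["h", "e", "l", "o", "w", "r", "d"]

# Synthetic training data: feature vectors labelled with the phone the
# patient was producing while reading text aloud.
X_train = rng.normal(size=(700, 16))
y_train = rng.choice(len(phones), size=700)
phone_model = GaussianNB().fit(X_train, y_train)

# A toy pronunciation dictionary mapping words to phone sequences.
dictionary = {"hello": ["h", "e", "l", "o"], "world": ["w", "o", "r", "l", "d"]}

def decode_word(frames):
    """Score each dictionary word against per-frame phone probabilities."""
    log_probs = phone_model.predict_log_proba(frames)   # (n_frames, n_phones)
    scores = {}
    for word, phone_seq in dictionary.items():
        # Crude alignment: stretch the phone sequence evenly over the frames.
        idx = np.linspace(0, len(phone_seq) - 1, len(frames)).round().astype(int)
        labels = [phones.index(phone_seq[i]) for i in idx]
        scores[word] = sum(log_probs[t, p] for t, p in enumerate(labels))
    return max(scores, key=scores.get)

print(decode_word(rng.normal(size=(12, 16))))  # prints "hello" or "world"
```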


COULD HACKERS GET INSIDE YOUR BRAIN? 

Experts at the University of Washington have revealed how hackers could insert images into dodgy apps and record our brain's unintentional reaction to them using brain-computer interfaces.

For example, when playing a video game users may see logos of familiar brands pop up on the screen and then vanish.

Hackers could plant those images in the game and record the brain's unintentional response to them using a BCI, which can be a wearable device that monitors stress levels or a cap covered in electrodes.

This technology could one day be used by advertisers to gather more information about their customers.

Also, police officers and government officials could use this method to help convict criminals, or as a 'remote lie detector test'.


'For the first time, we could show that brain activity can be decoded specifically enough to use ASR (automated speech recognition) technology on brain signals,' said Mr Herff.

'However, the current need for implanted electrodes renders it far from usable in day-to-day life.'

To go from here to a functioning thought-detection device will still require some work.

'A first milestone would be to actually decode imagined phrases from brain activity, but a lot of technical issues need to be solved for that,' said Herff. 

Earlier this year researchers at the University of Rochester revealed a computer program that searches for the brain activity related to certain words and then uses this to predict a sentence being thought, even if it has never seen that sentence before.

They said the system is able to get the predictions right around 70 per cent of the time.

Dr Andrew Anderson, a research fellow at the University of Rochester who led the study, said the technology could be used to help people who have suffered from a stroke to communicate.

The researchers, whose study was published in the journal Cerebral Cortex, used brain scans taken with functional magnetic resonance imaging from 14 participants as they silently read 240 unique sentences.
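The snippet below sketches the general idea of combining word-level brain patterns to identify an unseen sentence. The synthetic data and the simple averaging-plus-correlation model are assumptions for illustration, not the Rochester team's published method.

```python
# Minimal sketch: estimate an activity pattern per word, build an expected
# pattern for a new sentence from its words, and pick whichever candidate
# sentence best matches the observed scan. Synthetic data; the averaging
# model is an assumption, not the Cerebral Cortex study's actual method.
import numpy as np

rng = np.random.default_rng(2)
vocab = ["dog", "chased", "cat", "child", "read", "book"]
word_patterns = rng.normal(size=(len(vocab), 200))    # one pattern per word

def sentence_pattern(words):
    # Assume a sentence's fMRI pattern is roughly the mean of its words'.
    return word_patterns[[vocab.index(w) for w in words]].mean(axis=0)

candidates = [["dog", "chased", "cat"], ["child", "read", "book"]]
observed = sentence_pattern(["child", "read", "book"]) + rng.normal(scale=0.3, size=200)

# Correlate the observed scan with each candidate's predicted pattern.
best = max(candidates,
           key=lambda s: np.corrcoef(observed, sentence_pattern(s))[0, 1])
print(" ".join(best))   # expected: "child read book"
```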

