Computers could soon be able to decode your thoughts into actual speech or written words, without you saying a word. This kind of technology sounds like science fiction, but scientists are edging towards making it a reality in a variety of ways, according to a new review.
COMPUTER CAN TELL WHO YOU ARE THINKING ABOUT
Reading minds is an ability only found in comic book heroes.
But new research has revealed that computers can now analyse brain scans and work out who a person is thinking about.
The AI system can even create a digital portrait of the face in question.
Researchers at the Kuhl Lab at the University of Oregon used an innovative form of fMRI pattern analysis to test whether the lateral parietal cortex actively represents the contents of memory.
Using a large set of human face images, they first extracted latent face components, known as eigenfaces.
Machine learning algorithms were then used to predict these face components from fMRI activity patterns and reconstruct images of individual faces as digital portraits.
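The pipeline described above can be sketched with synthetic data. Everything here — the image size, the voxel count, the simple linear regression — is a stand-in for illustration, not the Kuhl Lab's actual analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 200 face images (flattened 32x32 pixels)
# and a matching fMRI activity pattern (500 voxels) for each image.
faces = rng.normal(size=(200, 1024))
fmri = rng.normal(size=(200, 500))

# Step 1: extract latent face components ("eigenfaces") with PCA.
mean_face = faces.mean(axis=0)
U, S, Vt = np.linalg.svd(faces - mean_face, full_matrices=False)
eigenfaces = Vt[:20]                              # top 20 components
components = (faces - mean_face) @ eigenfaces.T   # per-face scores

# Step 2: learn a linear map from fMRI patterns to face components
# (least-squares regression, one weight vector per component).
W, *_ = np.linalg.lstsq(fmri, components, rcond=None)

# Step 3: reconstruct a face from a brain scan by predicting its
# component scores and projecting back into pixel space.
predicted = fmri[0] @ W                    # scores for the first scan
reconstruction = mean_face + predicted @ eigenfaces
print(reconstruction.shape)                # one flattened face image
```

The key idea is that a face only needs a handful of component scores to describe, so the brain-to-image problem reduces to predicting those few numbers.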
Computers that can read our minds might enhance existing speech interfaces to devices, such as Siri and Ok Google.
But it could be far more important for people with speech difficulties, and especially for patients who lack any speech or motor function at all.
‘So instead of saying “Siri, what is the weather like today” or “Ok Google, where can I go for lunch?” I just imagine saying these things,’ said Christian Herff, author of a review recently published in the journal Frontiers in Human Neuroscience.
Reading someone’s thoughts might still belong to the realms of science fiction, but scientists are already decoding speech from signals generated in our brains when we speak or listen to someone talking.
In the new study, Mr Herff and co-author Dr Tanja Schultz, both from the Karlsruhe Institute of Technology, compared the pros and cons of using various brain imaging techniques to take neural signals from the brain and decode them to text.
There are a variety of technologies out there, the authors said, including functional MRI and near infrared imaging that detect neural signals based on the metabolic activity of neurons.
Another method can detect electromagnetic activity of neurons responding to speech.
But there was one method in particular, called electrocorticography, which stood out in the new review, the authors said.
This technique was used in a brain-to-text system, demonstrated on epilepsy patients who already had electrode grids implanted for treatment of their condition.
The patients read out texts presented on a screen in front of them while their brain activity was recorded.
This formed the basis of a database of neural signal patterns that could then be matched to speech elements, or ‘phones’.
When the researchers included language and dictionary models in their algorithms, they were able to decode neural signals to text with a high degree of accuracy.
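The matching step can be illustrated with a toy sketch. The phone templates, noise model, and two-entry dictionary here are all made up for illustration; the real Herff and Schultz system uses full automatic speech recognition machinery, not nearest-neighbour matching:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical phone templates: the average neural pattern (64 features)
# recorded while each phone was spoken, built from the read-aloud texts.
phones = ["h", "e", "l", "o"]
templates = {p: rng.normal(size=64) for p in phones}

def decode_phones(frames):
    """Match each neural frame to its nearest phone template."""
    out = []
    for frame in frames:
        scores = {p: -np.linalg.norm(frame - t) for p, t in templates.items()}
        out.append(max(scores, key=scores.get))
    return out

# A tiny dictionary model: snap decoded phone strings to known words.
dictionary = {"helo": "hello", "hel": "hell"}

# Simulate frames recorded while a patient says "hello"
# (each phone's template plus measurement noise).
frames = [templates[p] + rng.normal(scale=0.3, size=64)
          for p in ["h", "e", "l", "o"]]
decoded = decode_phones(frames)
word = dictionary.get("".join(decoded), "".join(decoded))
print(word)
```

The dictionary lookup stands in for the language models mentioned above: even if individual phones are decoded noisily, constraining the output to real words sharply improves accuracy.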
‘For the first time, we could show that brain activity can be decoded specifically enough to use ASR (automated speech recognition) technology on brain signals,’ said Mr Herff.
‘However, the current need for implanted electrodes renders it far from usable in day-to-day life.’
To go from here to a functioning thought-detection device will still require some work.
‘A first milestone would be to actually decode imagined phrases from brain activity, but a lot of technical issues need to be solved for that,’ said Herff.
Earlier this year, researchers at the University of Rochester revealed a computer program that searches for the brain activity related to certain words and then uses this to predict a sentence being thought, even if it has not seen that sentence before.
They said the system gets its predictions right around 70 per cent of the time.
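A toy sketch of that idea, assuming invented word-level activity maps (the actual Rochester study fitted its maps from real fMRI data): model each sentence's activity as a combination of its words' activity, then decode an unseen scan by picking the closest candidate sentence.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical activity maps for individual words (100 voxels each),
# standing in for patterns estimated from participants' fMRI scans.
words = ["the", "dog", "cat", "ran", "slept"]
word_maps = {w: rng.normal(size=100) for w in words}

def sentence_map(sentence):
    """Model a sentence's activity as the mean of its word maps."""
    return np.mean([word_maps[w] for w in sentence.split()], axis=0)

candidates = ["the dog ran", "the cat slept", "the cat ran"]

# Simulate a scan taken while someone silently reads a sentence the
# model has never seen: its modelled map plus measurement noise.
scan = sentence_map("the cat ran") + rng.normal(scale=0.2, size=100)
decoded = min(candidates,
              key=lambda s: np.linalg.norm(scan - sentence_map(s)))
print(decoded)
```

Because sentences are built from shared word maps, the model can score sentences it was never trained on, which is what lets it generalise to novel thoughts.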
Dr Andrew Anderson, a research fellow at the University of Rochester who led the study, said the technology could be used to help people who have suffered from a stroke to communicate.
The researchers, whose study was published in the journal Cerebral Cortex, used brain scans taken with functional magnetic resonance imaging from 14 participants as they silently read 240 unique sentences.