Mind-reading made possible with new AI technology

geralt | Pixabay

Rachel Guilbaud

Artificial intelligence continues to generate buzz as researchers make new strides toward emulating human abilities.

Much of today's AI is built on neural networks, computing systems loosely modeled on the way the human brain processes information. Recent breakthroughs have imitated human dialogue through the likes of ChatGPT, Bard and other chatbots. However, a recent study suggests the decoding capabilities of AI extend past speech and into people's private thoughts.

In a study published in Nature Neuroscience, a team of researchers at the University of Texas at Austin used an AI decoder to analyze brain scans. The scans came from functional magnetic resonance imaging (fMRI), which measures blood flow to different areas of the brain. From there, the decoder analyzed the brain signals in the scans and interpreted the meaning of the “perceived and imagined stimuli.”
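The published pipeline is more elaborate than a few lines of code (the team reportedly paired a model of brain responses with a generative language model to propose and score word sequences), but the core idea of mapping brain activity into a semantic space can be sketched simply. The snippet below is a hypothetical illustration only, using synthetic data, made-up candidate phrases and a plain ridge-regression decoder; it is not the study's actual code or method.

```python
# Toy sketch: learn a mapping from fMRI-like feature vectors to semantic
# (text-embedding) vectors, then "decode" a new scan by finding the candidate
# phrase whose embedding is closest. Synthetic data throughout; NOT the
# published method.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_voxels, n_semantic_dims, n_train = 200, 32, 500

# Hypothetical semantic embeddings for a handful of candidate phrases.
candidates = ["a dog runs through the park", "she opened the old letter",
              "the storm rolled in at night", "he ordered coffee and waited"]
candidate_embeddings = rng.normal(size=(len(candidates), n_semantic_dims))

# Synthetic training data: brain responses are modeled here as a noisy
# linear function of the semantic content the participant heard.
true_mapping = rng.normal(size=(n_semantic_dims, n_voxels))
train_semantics = rng.normal(size=(n_train, n_semantic_dims))
train_scans = train_semantics @ true_mapping + rng.normal(scale=0.5, size=(n_train, n_voxels))

# Fit a linear decoder that maps fMRI features to a semantic embedding.
decoder = Ridge(alpha=1.0).fit(train_scans, train_semantics)

# Simulate a new scan recorded while the participant hears candidate 2.
heard = candidate_embeddings[2]
new_scan = heard @ true_mapping + rng.normal(scale=0.5, size=n_voxels)

# Decode: predict the semantic vector, then pick the closest candidate phrase
# by cosine similarity -- recovering the "gist" rather than the exact words.
predicted = decoder.predict(new_scan.reshape(1, -1))[0]
scores = candidate_embeddings @ predicted / (
    np.linalg.norm(candidate_embeddings, axis=1) * np.linalg.norm(predicted))
print("Decoded gist:", candidates[int(np.argmax(scores))])
```

In this simplified picture, the decoder never reads out words directly; it predicts a point in a semantic space and reports the nearest meaning, which echoes why the real system recovers the gist rather than a verbatim transcript.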

The study focused on three participants, who underwent several hours of scanning while they either listened to storytelling podcasts or watched silent movies.

The decoding was successful, though not without setbacks. Neuroscientist Alexander Huth, who led the study at the University of Texas at Austin, cautioned that the method's accuracy has limits.

Huth told The Guardian that “[the] system works at the level of ideas, semantics, meaning. This is the reason why what we get out is not the exact words, it’s the gist.”

Additionally, fMRI scanners are costly and bulky, and the decoder requires extensive time to train on each individual's brain. Huth's team plans to use more portable brain-imaging systems in the future.

This is not the first attempt at mind-reading through AI algorithms. Previous efforts relied on surgical implants placed in the brain to track neural activity, but this study used one of the first non-invasive methods of language decoding.

This technology could eventually help patients who are unable to speak or who are experiencing paralysis communicate and be understood.

Furthermore, in 2017, researchers at Purdue University published a similar study in the journal Cerebral Cortex. Their work also involved three participants, fMRI scans and deep learning, a subset of artificial intelligence.

Instead of imagined speech, the Purdue researchers studied the images participants were viewing, and they were similarly able to identify those visual stimuli using AI.
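The Purdue work reportedly relied on deep convolutional networks trained on movie footage, which is beyond a short example, but the basic notion of identifying a viewed stimulus from brain activity can be sketched with a simple classifier. Everything below (the category names, voxel counts and linear model) is a hypothetical stand-in on synthetic data, not the researchers' method.

```python
# Toy sketch: predict which image category a participant was "viewing" from
# fMRI-like features, using synthetic data and a linear classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
categories = ["face", "animal", "building", "landscape"]
n_voxels, n_trials_per_category = 150, 60

# Synthetic scans: each category evokes a distinct average response pattern,
# observed through trial-to-trial noise.
patterns = rng.normal(size=(len(categories), n_voxels))
X = np.vstack([patterns[i] + rng.normal(scale=1.0, size=(n_trials_per_category, n_voxels))
               for i in range(len(categories))])
y = np.repeat(np.arange(len(categories)), n_trials_per_category)

# Hold out a test set and check how often the viewed category is identified.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy identifying the viewed category: {clf.score(X_test, y_test):.2f}")
```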

These new findings raise questions about mental privacy and whether thoughts can be readily accessed.

The University of Texas researchers emphasized that the decoder is non-invasive both physically and mentally, stating that “subject cooperation is required both to train and to apply the decoder.”

Voluntary cooperation is necessary for the decoder to succeed, because each decoder is trained on, and unique to, an individual brain.

As of now, this “mind-reading” technology is not entirely accurate and certainly has its limitations, but the core concept has been demonstrated. The method marks another step in the progression of AI and its role in expanding what cognitive science can do with technology.