A Brain Scanner Combined with an AI Language Model Can Provide a Glimpse into Your Thoughts

Allison Parshall writing for Scientific American (Apple News)

Now researchers have taken a step forward by combining fMRI’s ability to monitor neural activity with the predictive power of artificial intelligence language models. The hybrid technology has resulted in a decoder that can reproduce, with a surprising level of accuracy, the stories that a person listened to or imagined telling in the scanner. The decoder could even guess the story behind a short film that someone watched in the scanner, though with less accuracy.

And:

Here’s an example of what one study participant heard, as transcribed in the paper: “i got up from the air mattress and pressed my face against the glass of the bedroom window expecting to see eyes staring back at me but instead finding only darkness.” Inspecting the person’s brain scans, the model went on to decode, “i just continued to walk up to the window and open the glass i stood on my toes and peered out i didn’t see anything and looked up again i saw nothing.”

The researchers discuss using this to communicate with people who can't outwardly communicate. This is just the beginning, and it's already 80–90 percent accurate!

This is wild 🤯