Facebook’s plan to tap into our brains, fact-checked by a neuroscientist

No, Facebook will not be able to listen in on your inner monologue.

Facebook wants a direct line into your brain.

As Recode reports, the company’s secretive R&D lab — known as Building 8 — is creating a “‘brain-computer speech-to-text interface,’ technology that’s supposed to translate your thoughts directly from your brain to a computer screen without any need for speech or fingertips.” It’s also interested in developing a “brain mouse,” where people could use their attention to move a cursor across the screen.

The intent behind these initiatives is to a) help people with disabilities use the internet, and b) pioneer new ways for consumers to interact with technologies.

After reading the reports, I had many questions. Mainly: Is this bogus? So I called up Rebecca Saxe, a neuroscientist at MIT. Her first reaction: “The things Facebook is saying are crazy, but the technological idea is not crazy.”

Let’s break it down.

What’s impossible about Facebook’s idea

Facebook says the technology to turn thoughts to text will only pick up on the words you want to type, by detecting “activity in a very specific region of the brain where thoughts are translated into speech,” as Recode explains.

That makes it sound like Facebook wants to listen in on your inner monologue and transcribe it for you. That’s not plausible.

First off, the technology would be limited by the type of brain scanners that are available. Saxe, along with two other neuroscientists I contacted, said if Facebook were going to market a consumer-friendly brain scanner today, it would go with an EEG (a cap with electrodes that picks up on broad levels of brain activity).

Here’s the problem: EEG is a blunt instrument. It reads the overall levels of electrical activity in your brain. Doctors use it to see if you’re asleep or not. It’s not sensitive enough to pick up on the individual neurons that fire when you think, “Hey, guys! Some personal news!”

Saxe says you can get a bit more resolution if you implant electrodes directly into a person’s brain — “which is such an infection risk, I can’t imagine they’re going to do it,” she says. (Neither can I.)

Saxe explains that there are a few patients who have electrodes implanted directly into the speech regions of the brain. And even with that access, no software can pick up on specific words people are thinking about. The technology has progressed to the point where it can recognize when a person is done reading a sentence on a computer screen. “And that’s not for a sentence you thought yourself, but for a sentence we showed you,” she says.

Facebook’s Regina Dugan, who announced all of this exciting news, said the technology would optimally use “optical imaging” of the brain — but didn’t elaborate on what that means. Saxe thinks she’s referring to something called “functional near-infrared spectroscopy” (NIRS), which is like an EEG in that it isn’t invasive. But it doesn’t pick up on electrical activity. Instead, it uses lights and infrared sensors to pick up on changes in blood flow in the brain. This is similar to how fMRIs work, but it yields a fuzzier picture.

There’s been some interesting work where neuroscientists have been able to reconstruct movie scenes or memories just from looking at fMRI data. But all that yields is rough images, and only after the participants have spent hours hooked up to the machine. “A possible technological innovation would be to make a NIRS system that has spatial resolution comparable to fMRI,” Saxe says. “Many groups have thought about this, but the technology hasn’t been developed.”

Perhaps Facebook could pour billions into the program and figure it out. It’s just not feasible yet.

So what is possible?

So current technology can’t pick individual words out of your brain. But it can pick up on where you’re directing your attention. And scientists can use that signal to build mice and keyboards you control with your attention alone.

Here’s how this works, as Saxe explains it: Imagine a keyboard where each letter is illuminated by a light that flickers at its own individual beat.

So you look at this disco keyboard while connected to an EEG. “If you focus on a letter,” she says, “that temporal frequency is going to be more reflected in your visual cortex than any other.” And the EEG can pick up on that.

So if you look at the flashing A, the computer attached to the EEG will know you’re looking at A and can type it for you. Which means you could type with your visual attention alone.
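The detection step Saxe describes can be sketched in code. This is a toy simulation of my own, not anything Facebook has published: each letter is assigned a made-up flicker frequency, the "EEG" is a faked signal containing a weak oscillation at the attended letter's frequency plus noise, and a Fourier transform finds which candidate frequency dominates.

```python
import numpy as np

# Hypothetical sketch of attention-based letter detection. Assumes each
# on-screen letter flickers at a distinct rate, and the EEG over visual
# cortex echoes the flicker rate of the letter you're attending to.

FS = 250          # sampling rate in Hz (a typical EEG rate)
DURATION = 2.0    # seconds of recording per letter selection
LETTER_FREQS = {"A": 8.0, "B": 10.0, "C": 12.0}  # invented flicker rates, Hz

def simulate_eeg(attended_freq, noise_level=1.0, seed=0):
    """Fake EEG: a weak sine wave at the attended flicker rate, buried in noise."""
    rng = np.random.default_rng(seed)
    t = np.arange(0, DURATION, 1 / FS)
    signal = 0.5 * np.sin(2 * np.pi * attended_freq * t)
    return signal + noise_level * rng.standard_normal(t.size)

def decode_letter(eeg):
    """Pick the letter whose flicker frequency has the most spectral power."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(eeg.size, d=1 / FS)

    def power_at(f):
        return spectrum[np.argmin(np.abs(freqs - f))]

    return max(LETTER_FREQS, key=lambda letter: power_at(LETTER_FREQS[letter]))

eeg = simulate_eeg(LETTER_FREQS["B"])
print(decode_letter(eeg))  # recovers "B"
```

Even in this idealized version, note that each selection consumes a couple of seconds of recording, which is part of why the approach is slow.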

This isn’t as easy as it sounds.

“If we had to devote all that mental attention to selecting one letter at a time, we would clearly be slower and less efficient than typing with our hands,” Saxe says. “It would take so much concentration that you would get exhausted,” at least initially, without training. Using predictive text software (like the kind on a smartphone messaging app) could bump up the typing speed, but probably not to the 100 words per minute Facebook claims is possible.

This potential brain-computer interface could be very helpful for those with paralysis or disabilities that make typing hard or impossible. Facebook’s Dugan said that “even something as simple as a ‘yes/no’ brain click or a ‘brain mouse’ would be transformative.” And that would be huge for people with near-complete paralysis.

But the mass-market appeal is unclear. And keep in mind all of this requires a device placed on your skull to work (though Saxe informs me “fashionable EEG caps — that’s totally a thing”).

So is Facebook going inside your skull? It’s plausible. But it might not be all that useful in practice.
