Surgically implanted devices that allow paralyzed people to speak can also eavesdrop on their inner monologue.
That's the conclusion of a study of brain-computer interfaces (BCIs) published in the journal Cell.
The finding could lead to BCIs that allow paralyzed users to produce synthesized speech more quickly and with less effort.
But the idea that new technology can decode a person's inner voice is "unsettling," says Nita Farahany, a professor of law and philosophy at Duke University and author of the book The Battle for Your Brain.
"The more we push this research forward, the more transparent our brains become," Farahany says, adding that measures to protect people's mental privacy are lagging behind technology that decodes signals in the brain.
From brain signal to speech
BCIs decode speech using tiny electrode arrays that monitor activity in the brain's motor cortex, which controls the muscles involved in speaking. Until now, those devices have relied on signals produced when a paralyzed person is actively trying to speak a word or sentence.
"We're recording the signals as they're attempting to speak and translating those neural signals into the words that they're trying to say," says Erin Kunz, a postdoctoral researcher at Stanford University's Neural Prosthetics Translational Laboratory.
Relying on signals produced when a paralyzed person attempts speech makes it easy for that person to mentally zip their lip and avoid oversharing. But it also means they have to make a concerted effort to convey a word or sentence, which can be tiring and time-consuming.
So Kunz and a team of scientists set out to find a better way — by studying the brain signals from four people who were already using BCIs to communicate.
The team wanted to know whether they could decode brain signals far more subtle than those produced by attempted speech: imagined speech.
During attempted speech, a paralyzed person is doing their best to physically produce understandable spoken words, even though they no longer can. In imagined or inner speech, the individual merely thinks about a word or sentence — perhaps by imagining what it would sound like.
The team found that imagined speech produces signals in the motor cortex that are similar to, but fainter than, those of attempted speech. And with help from artificial intelligence, they were able to translate those fainter signals into words.
"We were able to get up to a 74% accuracy decoding sentences from a 125,000-word vocabulary," Kunz says.
Decoding a person's inner speech made communication faster and easier for the participants. But Kunz says the success raised an uncomfortable question: "If inner speech is similar enough to attempted speech, could it unintentionally leak out when someone is using a BCI?"
Their research suggested it could, in certain circumstances, like when a person was silently recalling a sequence of directions.
Password protection?
So the team tried two strategies to protect BCI users' privacy.
First, they programmed the device to ignore inner speech signals. That worked, but took away the speed and ease associated with decoding inner speech.
So Kunz says the team borrowed an approach used by virtual assistants like Alexa and Siri, which wake up only when they hear a specific phrase.
"We picked Chitty Chitty Bang Bang, because it doesn't occur too frequently in conversations and it's highly identifiable," Kunz says.
That allowed participants to control when their inner speech could be decoded.
But the safeguards tried in the study "assume that we can control our thinking in ways that may not actually match how our minds work," Farahany says.
For example, Farahany says, participants in the study couldn't prevent the BCI from decoding the numbers they were thinking about, even though they did not intend to share them.
That suggests "the boundary between public and private thought may be blurrier than we assume," Farahany says.
Privacy concerns are less of an issue with surgically implanted BCIs, which are well understood by users and will be regulated by the Food and Drug Administration when they reach the market. But that sort of education and regulation may not extend to upcoming consumer BCIs, which will probably be worn as caps and used for activities like playing video games.
Early consumer devices won't be sensitive enough to detect words the way implanted devices do, Farahany says. But the new study suggests that capability could be added someday.
If so, Farahany says, companies like Apple, Amazon, Google and Meta might be able to find out what's going on in a consumer's mind, even if that person doesn't intend to share the information.
"We have to recognize that this new era of brain transparency really is an entirely new frontier for us," Farahany says.
But it's encouraging, she says, that scientists are already thinking about ways to help people keep their private thoughts private.
Copyright 2025 NPR