Image: Mariia Shalabaieva via Unsplash

As scientists show they can read inner speech, brain implant ‘pioneers’ fight for neural data privacy, access rights

It was an easy decision for J. Galen Buckwalter, a 69-year-old quadriplegic living in Southern California, to undergo a craniotomy in 2024. The operation — which involved inserting 384 electrodes in his brain and a large titanium plate in his skull — allows researchers to record data about how his neurons operate, potentially helping future paralysis patients.

The hard part, Buckwalter says, has been giving up the right to access and own his neural data and feel assured that it will be kept private.

Buckwalter has been a quadriplegic since diving into a river and hitting his head on a submerged rock a few days before his 17th birthday. When given the chance to participate in a California Institute of Technology (Caltech) study designed to decode how his neurons signal his hands to grasp objects, Buckwalter leapt at the opportunity to improve the lives of future “young, dumb kids like I was.”

As the researchers’ work studying his brain progressed, and they dug deeper into capturing how, where and when his individual neurons fire, Buckwalter began to think twice about the lack of specific protections outlined in the informed consent form he signed to take part in the study. He decided he wants the right to access and own his data and have more explicit guardrails in place to protect his privacy.

With scientists now demonstrating that they can decode attempted speech based on the neural data they collect from brain-computer interface (BCI) research subjects with implants, patients and advocates say the importance of adequate data protections has grown. They point to how Elon Musk’s Neuralink is promising a near-term future where tens of thousands of people each year receive implants to treat a variety of medical conditions or even to improve cognitive function.

BCIs, which can be implanted or worn, enable direct communication between the brain and external devices, allowing individuals to control the devices with their thoughts. Implanted BCIs extract far more sensitive data than wearables do, and in both cases third parties can access the neural data.

For Buckwalter, who has electrodes located in his prefrontal cortex — an area of the brain which plays a critical role in decision-making and inner speech and contributes to personality and emotional regulation — the data privacy questions are particularly fraught. Caltech devices are recording activity in five areas of his brain and he is the first person ever to receive an implant in the dorsolateral prefrontal cortex.

Buckwalter, who calls himself a “neuronaut,” is working with a group of other paralysis patients with BCI implants to design a set of ethical and legal norms which they would like to see implemented in future research studies and clinical trials. 

The group, which calls itself the BCI Pioneers Coalition, is now studying examples of informed consent agreements. They plan to launch a campaign advocating for agreements which offer more privacy and data access rights for study participants, and hope to submit a paper with proposed guidelines to a journal in the coming months.

“The appropriate framing should be that [researchers] have a piece of me there, of my individual person,” Buckwalter said. “Who owns my thoughts, fundamentally, is what we're talking about.”

Image: J. Galen Buckwalter

Regulating brain exploration

Late last month, Stanford researchers published a study showing that by using artificial intelligence they could decode the inner speech of four ALS and stroke patients from brain signals with a startling degree of accuracy. Word error rates ranged from 26 to 54 percent for large vocabulary sets, and specific words could be decoded accurately about half of the time.

Some implanted BCIs can now decode “unconstrained sentences” at about 60 words a minute in paralyzed users, according to Nita Farahany, author of The Battle for Your Brain: Defending Your Right to Think Freely in the Age of Neurotechnology. Farahany calls the development a “major milestone.”

Consumers using neurotech to meditate, play games or improve sleep are protected in many states by comprehensive data privacy laws safeguarding “personal data” and “sensitive data.” Four states have enacted laws specifically protecting consumers’ neural data.

State consumer data privacy laws do not cover the work done in research studies and clinical trials, which are expanding quickly as BCI research is growing to include potential treatments for blindness, Parkinson’s disease, depression and other ailments.

The well-known federal privacy statute HIPAA (the Health Insurance Portability and Accountability Act) only applies to certain information which is placed in an individual’s medical record. In some cases, neural data collected during research studies is not protected by HIPAA, according to Kristen Mathews, a lawyer specializing in neural data privacy at the law firm Cooley.

Informed consent forms which study participants sign generally detail how data are treated, but there is not a uniform standard for how they are written in the U.S. Existing clinical trial regulations in the U.S. punt most of the details about data rights to informed consent forms, Mathews said. 

Federally funded studies are covered by what’s known as the Common Rule, but when it comes to issues like data sharing and retention the rule defers decisions to institutional review boards and informed consent forms, according to Margot Hanley, director of research at Duke University’s Cognitive Futures Lab.

“The landscape for invasive BCI data governance is fairly fragmented and unclear,” Hanley said. “Participants’ privacy protections could feasibly come down to what is in the informed consent document itself, which is determined on a case-by-case basis by the university.”

BCI patients, including Buckwalter, say those agreements often do not give them enough disclosures about what researchers will do with their data and who they will share it with. They also typically bar patients from accessing their data so that study participants cannot share it with other scientists in the interest of advancing research into their conditions.

People deciding whether to participate in a study would benefit from knowing what kinds of data will be collected by their implant, what other information about them could be derived from it and who that data will be shared with, Mathews said.

Mathews is now working with the BCI Pioneers to establish the guidelines they are drafting, which they intend to use as a template for a consent form addendum for all studies involving brain implants. Buckwalter says he and the other BCI Pioneers want standards in place which will enumerate data privacy risks to participants and do as much as possible to minimize them.

“We've lost a lot of control of our private information over the years, and there's really only one thing that we still have, which is the sanctity of our innermost thoughts and feelings, the stuff that's in our brain,” Mathews said.

A ‘seat at the table’

Last month, Buckwalter met with the principal investigator for the Caltech study about adding more robust and specific protections to the latest informed consent form he has been asked to sign.

Caltech has “given themselves carte blanche in the way the current consent is structured and you're sort of forced to go along with that if you want to do what you're doing, which is very important,” he said.

The research director was open to his input, Buckwalter said, and the Caltech team is considering adding language which more specifically addresses how his neural privacy is protected and giving him access to his data. He currently cannot access it, much less own it.

A spokesperson for Caltech said the university is “evaluating the issue and the concerns expressed.”

A longtime punk rocker, Buckwalter recently recorded a song with beats correlated to how rapidly his individual neurons fire. It could only be composed with the aid of one of the researchers tracking his brain data since Buckwalter has no access rights.

That limitation could stall Buckwalter’s planned partnership with a professor at Arizona State University who is taking his writings and feeding them into a large language model. The professor intends to create a virtual clone of Buckwalter which would engage with his implants in order to gain a window into how his consciousness works. To participate in that effort, Buckwalter needs the data Caltech has captured.

“I should be able to decide to engage with people who are going to advance science beyond academic silos,” Buckwalter said. “Some would argue I have a moral obligation to do so, an argument I agree with.”

Buckwalter admires the team he works with, he said, but was surprised by how little thought its leadership and the institutional review board had given to the nuance of the data privacy, access and ownership questions he raised. The situation underscored for him how vital the BCI Pioneers Coalition’s effort is, he said.

“The desire is to start addressing this issue for the broader BCI community because scaling is going to happen,” Buckwalter said, pointing to Elon Musk’s stated plans to implant 20,000 people a year with Neuralink chips by 2031. “If we don't deal with it now, it's not going to end well — this data literally can be used to understand me at my core.”

Neuralink is reportedly launching an October clinical trial to test a brain implant which can translate thought into text. The company said it wants to implant the device into a healthy subject by 2030. Meanwhile, Sam Altman, co-founder and CEO of OpenAI, is reportedly preparing to bankroll a BCI company which will directly compete with Neuralink.

Ian Burkhart, another BCI Pioneer who previously had electrodes inserted into his brain for a study, is working with Buckwalter and Mathews on the initiative.

“Ethical considerations are discussed [in consent agreements], but there's no one leading from the end user point of view,” he said, highlighting the fact that the BCI Pioneers want a “seat at the table.”

Buckwalter says participants often do not get enough information about everything a given study will probe. If researchers plan to analyze the brain data to better understand internal speech, he said, participants should know that because they may be uncomfortable with their neural information being read in such an intimate way.

‘The frontier is constantly advancing’

As researchers increasingly use artificial intelligence to augment their understanding of internal speech, Buckwalter finds himself thinking about how long his data will be stored and how future innovations will allow for more invasive analysis of what is happening inside his head now.

“With advances in AI, what we can decode from this data is expanding quickly… and the frontier is constantly advancing,” Hanley said, emphasizing that protections should account for future and unforeseeable risks. “Neural data collected today could, in ten years, be mined for information that is unimaginable now, like glimpses of a person’s stray thoughts.” 

De-identification measures that feel robust now also may not hold up as computational techniques advance, Hanley added.

The bottom line is that pioneering neural data research subjects must be brought into the conversation, Buckwalter said.

“We are the tip of the spear in a brand new field and we've taken on quite a bit of risk to do these studies,” he said. “We are the first people to know our neural data, which equates to ourselves, is out there basically stored on a server.”

Suzanne Smalley is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.