MIND Act Asks FTC to Study Exploitation Risks for Neural Data Collected by Devices

Are brain waves and similar neural data the next frontier in consumer privacy worries? A trio of U.S. Senate Democrats has introduced federal legislation aiming to get ahead of the risk that brain-related data could be collected and misused by tech firms, data brokers, government agencies and others.
The Management of Individuals’ Neural Data Act of 2025 – or the MIND Act – was unveiled Wednesday by Senate Democratic Leader Chuck Schumer, D-N.Y., along with co-sponsors Sen. Maria Cantwell, D-Wash., ranking member of the Senate Commerce, Science and Transportation Committee, and Sen. Ed Markey, D-Mass.
The MIND Act, among other provisions, calls for the Federal Trade Commission to study how neural data – and related data such as biometric and behavioral information – should be safeguarded to protect privacy and prevent exploitation as neurotechnology rapidly advances.
The senators contend that a growing number of consumer wearables and devices “are quietly harvesting sensitive brain-related data with virtually no oversight and no limits on how it can be used.”
Neural data, such as brain waves or signals from neural implants, can potentially reveal thoughts, emotions or decision-making patterns that could be collected and used by third parties, such as data brokers, to manipulate consumers and even potentially threaten national security, the senators said.
“Our mental states, thoughts and feelings are often described as the ‘final frontier’ in privacy,” said Eric Null, co-director of the privacy and data program at advocacy group Center for Democracy & Technology. “And attempts to get at these mental states to monetize and otherwise make use of that data should not operate in a metaphorical ‘Wild West’ where anything goes.”
Companies are increasingly seeking to collect and use this data for a variety of purposes, including predicting and influencing human behavior, the senators said.
Null said interest is growing in products that track people’s mental states, including “claims around detecting emotional states.” While neural data collected in a clinical environment may be covered by HIPAA, data collected from general consumer products is not protected under HIPAA and is covered only under the FTC’s general unfair and deceptive practices authority, Null said.
“Non-HIPAA neural data needs protecting to ensure that companies do not use such data to take advantage of or harm people, and that the data is used ethically and is secured properly, and that data practices benefit as much as possible the person whose mental states are being tracked,” he said.
‘Good Start’
The MIND Act is “a good start” at recognizing the potential risks and harms that can come from use of neurotechnology, said Calli Schroeder, senior counsel and global privacy counsel at the Electronic Privacy Information Center, a privacy advocacy group.
“However, the act itself is less a regulation and more a call for more research: It mandates that a report and guidelines on neurotech be issued and points to areas to be included and considered in those documents, but does not actually produce hardline requirements,” she said.
“In a way, this kicks the regulatory can down the road rather than issuing baseline protections, allowing neurotechnology to continue expanding without individual protections in place in the meantime,” she said. “It is hard to substantively evaluate what is essentially a prompt for future documents.”
State Efforts
A handful of U.S. states have already enacted laws related to the protection of neural data. The first was Colorado in 2024, with California, Connecticut and Montana enacting their own laws shortly thereafter.
“Technology is advancing at a significant rate and it is prudent for Congress to stay on the forefront instead of lagging behind,” said regulatory attorney Rachel Rose. “The fact that states are already passing these laws is also a sign that Congress should be considering legislation.”
Colorado defines neural data as “information that is generated by the measurement of the activity of an individual’s central or peripheral nervous systems and that can be processed by or with the assistance of a device,” Rose said.
Neural data is a subcategory of “biological data,” which Colorado defines as “data generated by the technological processing, measurement, or analysis of an individual’s biological, genetic, biochemical, physiological, or neural properties, compositions, or activities or of an individual’s body or bodily functions, which data is used or intended to be used, singly or in combination with other personal data, for identification purposes,” she said.
“Both the proposed MIND Act of 2025 and Colorado neural data and biometric data laws aim to prevent privacy violations and obtain adequate consent,” Rose said.
“The technologies that are utilized to track data may not be familiar to many but they do tie back into the definitions of neural data, biometric data, neurotechnology and other related data, which are found in either the Colorado law or the MIND Act of 2025,” she said.
In the meantime, some other nations have also started work to protect neural data, Schroeder said. “The Chilean Constitution has already been amended to include a specific right to neuroprotection, Article 3 of the EU Charter of Fundamental Rights sets out the right to respect an individual’s mental integrity, and proposals to both Brazil and Spain’s laws to address neural data protections have also been submitted,” she said. “Legislative action here falls into a global trend of protection from exploitation.”
Building Guardrails
The MIND Act aims to build guardrails that protect privacy and consent involving neural data, the senators said in a statement.
“As technology continues to develop at a rapid pace, it is vital that we balance innovation with safety and privacy,” Schumer said. “For example, data about a person’s brain activity could be misused to push manipulative ads or high-risk financial schemes designed to take advantage of consumers at their most vulnerable moments,” he said.
The proposed legislation defines neurotechnology as “a device, system, or procedure that accesses, monitors, records, analyzes, predicts, stimulates or alters the nervous system of an individual to understand, influence, restore, or anticipate the structure, activity or function of the nervous system.”
Neural data is defined in the proposal as “information obtained by measuring the activity of an individual’s central or peripheral nervous system through the use of neurotechnology.”
Among the companies currently developing neurotechnology is Neuralink, the brain-computer interface firm founded by Elon Musk.
Neuralink is currently in clinical trials for an implantable, wireless brain device designed to interpret a person’s neural activity, allowing patients to operate a computer or smartphone “by simply intending to move – no wires or physical movement are required.” Neuralink said the study is aimed at helping people with quadriplegia control external devices with their thoughts.
Neuralink did not immediately respond to Information Security Media Group’s request for comment on the senators’ proposals.
“We must ensure that Americans know how this data is being collected and used, that their consent matters and that strong guardrails are in place so innovation serves people – not exploits them,” Schumer said in the statement.
Null said that the introduction of the MIND Act – and the push for the FTC to examine neural data privacy issues – is an important first step on a national level in the United States.
“Studying an issue and coming up with recommendations around that issue is always important and it is good to increase Congress’ and the public’s understanding of an issue, especially a complicated one like neural privacy,” Null said. “We are glad to see folks in Congress paying attention.”
The FTC did not immediately respond to ISMG’s request for comment on the MIND Act’s proposal for the agency to study neural data privacy issues.
