Three Democratic senators have sounded the alarm over the unregulated practices of neurotech companies, warning that they are selling sensitive brain data without proper safeguards in place. In a letter to the Federal Trade Commission (FTC), Sens. Chuck Schumer (D-NY), Maria Cantwell (D-WA), and Ed Markey (D-MA) called for an investigation into how neurotech companies handle user data and urged tighter regulation of their data-sharing policies.
The senators' concerns center on the ability of neurotech companies to collect and potentially sell neural data, which can reveal mental health conditions, emotional states, and cognitive patterns, even when anonymized. This information is not only deeply personal but also strategically sensitive, making it a prime target for exploitation. The letter cites a 2024 report by the Neurorights Foundation, which found that most neurotech companies have few safeguards on user data and can share sensitive information with third parties.
The report looked at the data policies of 30 consumer-facing brain-computer interface (BCI) companies and found that all but one appear to have access to users' neural data, with no meaningful limitations to this access. The companies surveyed make it difficult for users to opt out of having their neurological data shared with third parties, with just over half explicitly letting consumers revoke consent for data processing and only 14 of the 30 giving users the ability to delete their data.
The senators are calling on the FTC to investigate whether neurotech companies are engaging in unfair or deceptive practices that violate the FTC Act, compel companies to report on data handling and commercial practices, and clarify how existing privacy standards apply to neural data. They are also urging the FTC to begin a rulemaking process to establish safeguards for neural data and set limits on secondary uses like AI training and behavioral profiling.
While the concept of neural technologies may conjure up images of brain implants like Elon Musk's Neuralink, there are far less invasive and less regulated neurotech products on the market, including headsets that help people meditate, purportedly trigger lucid dreaming, or promise to help users with online dating. These consumer products gobble up users' neurological data, and because they aren't categorized as medical devices, the companies behind them aren't barred from sharing that data with third parties.
Stephen Damianos, the executive director of the Neurorights Foundation, noted that there is a "very hazy gray area" between medical devices and wellness devices, with some products marketed for health and wellness purposes but not subject to the same regulations as medical devices. He emphasized the need for informed consent and transparency in the collection and use of neural data, citing the potential risks and benefits of these technologies.
The senators' warning comes as states like Colorado and California have begun to take steps to protect consumers' neural data. In April 2024, Colorado passed the first-ever legislation protecting consumers' neural data, and in September, California amended its Consumer Privacy Act to cover neural data. These moves highlight the need for federal regulations to safeguard users' sensitive information and ensure that neurotech companies are held accountable for their practices.
The implications of unregulated neurotech practices are far-reaching, with potential consequences for individuals, businesses, and society as a whole. As neurotech companies continue to push the boundaries of what is possible with brain-computer interfaces, it is essential that policymakers and regulators take a proactive approach to protecting users' rights and preventing abuses of power.