Brain Data Leaks Out: How Consumer Neurotech Puts Mental Privacy at Risk
A few weeks ago, I read a headline that sounded like a new episode of Black Mirror: “Brain data of Leclerc leaked – possible misuse for military research.”
At first, I thought it was clickbait. But the story was real: an investigative report revealed that a neurotechnology company had collected EEG data from professional athletes, some of whom allegedly had connections to state and military institutions (Nardi, 2025). What concerned me most was that the leak came from a consumer EEG-based neurofeedback headband, a type of device similar to those I use in my own research. This technology was developed in academic research but is now deployed commercially with far less oversight. In my own work, we develop and test neurofeedback to strengthen cognitive functions, helping people train the brain networks involved in executive functions such as working-memory updating and cognitive flexibility (Smit et al., 2023). The same principle is now being sold directly to consumers.
This contrast between carefully regulated academic research and rapidly expanding consumer neurotechnology reveals deeper ethical problems. In this post, I explore how commercial EEG devices are being used, why they raise new forms of surveillance and data risks, and why neuroliteracy is becoming essential for protecting our mental privacy.
How commercial EEG headbands are used
Wearable EEG headbands promise to track mental states, optimize performance, and even improve happiness. The global neurotechnology market already exceeds USD 15 billion and is expected to triple by 2034 (Expert Market Research, 2025). Part of this boom is the rise of direct-to-consumer (DTC) neurotechnologies: devices marketed for wellness or performance that blur the line with medical use and slip past medical-device regulations.
One such DTC device, the FocusCalm, provides users with a “focus score” and visual feedback to train attention and relaxation. At first glance, it seems like harmless self-improvement. But this same product, the one behind the Leclerc headline, was at the centre of the data-leak investigation. The company behind it, long branded as a Harvard-founded start-up, was funded by Chinese state-linked investors, including sanctioned defence contractors, and may have collected EEG data from schoolchildren and elite athletes, stored on cloud servers abroad. Whether these allegations hold up or not, the case highlights a fragile boundary: what begins as self-improvement can easily become data extraction.
Direct-to-consumer neurotechnologies marketed for wellness or performance blur the line with medical usage and slip past medical-device regulations. What begins as self-improvement can easily become data extraction.
Yet data extraction is only one side of the problem. Other examples highlight a different concern: neurotechnology used for workplace monitoring and control. In China’s high-speed rail system, train drivers reportedly wear EEG headbands to track attention and fatigue (Fontanillo Lopez et al., 2020; People’s Daily Online, 2018). At first, this seems beneficial: preventing accidents and saving lives. Yet reports suggest that wearing these devices is mandatory, and that workers who appear “unfocused” based on brain data can face disciplinary measures or dismissal. Such practices mark a shift from assistive technology to cognitive surveillance, in which employees’ inner states become objects of continuous monitoring and managerial control.
Ironically, many consumer EEG headbands do not even measure clean brain signals: their sensors often pick up muscle activity or are placed far from the brain regions they claim to monitor. This raises a troubling possibility: disciplinary actions may be based not on the actual mental state of employees, but on noisy, ambiguous, or misinterpreted data.
Since consumer EEG headbands do not always measure clean brain signals, disciplinary actions based on cognitive surveillance may rest not on the actual mental state of employees but on noisy, ambiguous, or misinterpreted data.
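How easily muscle activity can distort a “focus” metric is simple to demonstrate. The Python sketch below is purely illustrative (it is not any vendor’s algorithm): it computes a naive beta-band “focus score” on a simulated EEG signal and shows that adding a simulated muscle (EMG) artifact, such as a jaw clench, inflates the score even though the underlying brain signal is unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 256  # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)

# Simulated "brain" signal: a 10 Hz alpha rhythm plus broadband background noise.
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Simulated muscle (EMG) artifact: strong broadband noise, as from a jaw clench.
emg = 2.0 * rng.standard_normal(t.size)

def beta_power(x, fs):
    """Naive 'focus score': mean spectral power in the 13-30 Hz beta band."""
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
    band = (freqs >= 13) & (freqs <= 30)
    return psd[band].mean()

clean_score = beta_power(eeg, fs)
contaminated_score = beta_power(eeg + emg, fs)

# The underlying brain signal is identical, yet the "focus score" jumps.
print(f"clean: {clean_score:.2f}, with muscle artifact: {contaminated_score:.2f}")
```

A system that interprets this score as “attention” would register a change in mental state that never happened, which is exactly why disciplinary decisions built on such metrics are so problematic.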
The regulatory paradox
Neurotechnology has always raised ethical issues, but these become sharper outside well-regulated academic research. In academia, every study must undergo ethics review, comply with the GDPR, and meet high methodological standards. By contrast, many commercial EEG devices fall outside medical regulation and are not subject to independent ethical review, leaving users to consent to the collection of sensitive neural data with nothing more than a click on “I agree.” Mental privacy is therefore a central concern: EEG data can in principle reveal patterns linked to attention, fatigue, or pathology, despite the measurement limitations of such wearables in everyday use. That is the regulatory paradox: strict oversight in academia, near absence in commerce.
Academics impose the highest standards in low-risk research, while the commercial sphere – where neural and biometric data are harvested at a large scale – operates largely unregulated.
This imbalance becomes even more consequential as these systems begin to incorporate artificial intelligence (AI); for example, wearable data can now be uploaded for analysis in ChatGPT Health. The ethical problems listed above (surveillance, data extraction, and loss of control) already arise without automation, and most of the examples discussed here do not yet rely on AI. When AI is added, these risks are amplified: monitoring scales up, privacy risks grow, and decisions shift to opaque technical infrastructures. This is why we need neuroliteracy: a shared public understanding of what brain data are, what can and cannot be inferred from them, and how they may be used.
How neuroliteracy can protect our cognitive liberty
Cognitive liberty is the right of self-determination over our own minds — to keep our thoughts, emotions, and neural signatures from being mined without consent – Prof. Nita Farahany.
The legal scholar and neuroethicist Nita Farahany termed this key issue “cognitive liberty”: the right of self-determination over our own minds, and the freedom to keep our thoughts, emotions, and neural signatures from being mined without consent (Farahany, 2023). Neuroliteracy begins with simple but fundamental questions: Who owns our neural data? Where do they go? What can companies infer from them? To foster such awareness, we created the YouTube clip “Science or Science Fiction” (Dapor & Enriquez-Geppert, 2019), using examples such as commercial EEG headbands, Neil Harbisson, and Neuralink to highlight the gaps between regulated research and unregulated commercial neurotech.
At a recent NeuroAI workshop, my colleague Dr. Lorena Flórez Rojas asked whether brain data should be treated differently from other health data, raising the issue of neuro-exceptionalism. She argues that instead of creating new rights, we should reinterpret existing ones, such as freedom of thought, for the neurotechnological era, backed by stronger data rules, independent oversight, and the power to recall products or fine companies that cross the line (Rojas, 2022). Some countries have already acted: Chile and Mexico amended their constitutions to protect mental integrity and neurorights. These debates are therefore no longer abstract. The UN Office on Drugs and Crime and Interpol are currently drafting guidelines for the use of neurotechnology in law enforcement, a sign that neural privacy and cognitive rights have already entered international policy-making (United Nations, 2025).
Debates on cognitive liberty are no longer abstract: Chile and Mexico amended their constitutions to protect mental integrity and neurorights, and the United Nations is following.
Ultimately, the same wearable devices that promise to help us “focus better”, like the FocusCalm headset, already collect highly personal brain data. Cognitive liberty is not an abstract concept; it is about the technologies we use every day and whether they truly serve us as individuals.
The question is no longer if our brain and health data are being collected, but whether we understand what that means. Thus, neuroliteracy is the first step toward protecting our mental privacy and ensuring that neurotechnology remains a tool for empowerment, not exploitation.
References (asterisk* indicates non-scientific sources)
Dapor, C., & Enriquez-Geppert Lab. (2019). Science or science fiction? Brain-computer interfaces in applied neuroscience, avant-garde and advertisements: A documentary [YouTube video]. YouTube. https://www.youtube.com/watch?v=vHBiXsyblgQ
*Expert Market Research (2025, July). Neurotechnology market growth analysis – Forecast trends and outlook (2025-2034). https://www.researchandmarkets.com/reports/6163334/neurotechnology-market-growth-analysis#src-pos-1
*Farahany, N. A. (2023). The battle for your brain: defending the right to think freely in the age of neurotechnology. St. Martin’s Press.
Fontanillo Lopez, C. A., Li, G., & Zhang, D. (2020). Beyond technologies of electroencephalography-based brain-computer interfaces: A systematic review from commercial and ethical aspects. Frontiers in Neuroscience, 14, 611130.
*Nardi, W., Patel, D., Spendley, B., MacDonald, A., & Koppelman, S. (2025, September 16). BrainCo: The “Harvard” startup that became a “Little Dragon” in China — with brain data from U.S. Olympians and schoolchildren. Hunterbrook Media. https://hntrbrk.com/brainco/
*People’s Daily Online. (2018, May 8). New tech allows checks on workers’ state of mind when under pressure. https://en.people.cn/n3/2018/0508/c90000-9457379.html
Rojas, M. L. F. (2022). Neuromarketing vs libertad y autonomía de las decisiones del consumidor. Revista Brasileira de Direitos Fundamentais & Justiça, 16(1), 55-86.
Smit, D., Dapor, C., Koerts, J., Tucha, O. M., Huster, R. J., & Enriquez-Geppert, S. (2023). Long-term improvements in executive functions after frontal-midline theta neurofeedback in a (sub) clinical group. Frontiers in Human Neuroscience, 17, 1163380.
*United Nations. (2025, October). Background paper: Neurotechnology, law enforcement, and criminal justice — uses, risks, and human rights safeguards. United Nations.
————
The featured image was downloaded from Mindfield Biosystems and is free to use.



