
Wearable Brain Devices Will Challenge Our Mental Privacy

A new era of neurotechnology means we may need new protections to safeguard our brain and mental experiences

A man wearing a virtual reality headset with his hands outstretched in front of him.

A last bastion of privacy, our brains have remained inviolate, even as sensors now record our heartbeats, breaths, steps and sleep. All that is about to change. An avalanche of brain-tracking devices—earbuds, headphones, headbands, watches and even wearable tattoos—will soon enter the market, promising to transform our lives. And threatening to breach the refuge of our minds.

Tech titans Meta, Snap, Microsoft and Apple are already investing heavily in brain wearables. They aim to embed brain sensors into smart watches, earbuds, headsets and sleep aids. Integrating them into our everyday lives could revolutionize health care, enabling early diagnosis and personalized treatment of conditions such as depression, epilepsy and even cognitive decline. Brain sensors could improve our ability to meditate, focus and even communicate with a seamless technological telepathy, using the power of thoughts and emotions to drive our interactions with augmented reality (AR) and virtual reality (VR) headsets, or even to type on virtual keyboards with our minds.

But brain wearables also pose very real risks to mental privacy, freedom of thought and self-determination. As these devices proliferate, they will generate vast amounts of neural data, creating an intimate window into our brain states, emotions and even memories. We need the individual power to shutter this new view into our inner selves.


Employers already seek out such data, tracking worker fatigue levels and offering brain wellness programs to mitigate stress, via platforms that give them unprecedented access to employees’ brains. Cognitive and emotional testing based on neuroscience is becoming a new job-screening norm, revealing aspects of personality that may have little to do with the job. In China, train conductors on the Beijing-Shanghai line, the busiest of its kind in the world, wear brain sensors throughout their workday. There are even reports of Chinese employees being sent home when those sensors show less than stellar brain metrics. As companies embrace brain wearables that can track employees’ attention, focus and even boredom, they could, without safeguards in place, trample on employees’ mental privacy, eroding trust and well-being along with the dignity of work itself.

Governments, too, are seeking access to our brains. A U.S. brain initiative aims to capture “every spike from every neuron” in the human brain, to reveal “how the firing of these neurons produced complex thoughts.” While aimed at the underlying causes of neurological and psychiatric conditions, this same investment could also enable government interference with freedom of thought, a freedom critical to human flourishing. From functional brain biometric programs under development to authenticate individuals (including those funded by the National Science Foundation at Binghamton University) to so-called brain-fingerprinting techniques used to interrogate criminal suspects (sold by companies like Brainwave Science and funded by law enforcement agencies from Singapore to Australia to the United Arab Emirates), we must act quickly to ensure neurotechnology benefits humanity rather than heralding an Orwellian future of spying on our brains.

The rush to hack the human brain veers from neuromarketing to the rabbit hole of social media and even to cognitive warfare programs designed to disable or disorient. These technologies should have our full attention. Neuromarketing campaigns, such as one conducted by Frito-Lay, have drawn on insights into how women’s brains influence snacking decisions, then monitored brain activity while people viewed newly designed advertisements, allowing the company to fine-tune its campaigns to better capture attention and drive women to snack more on its products. Social media “like” buttons and notifications are features designed to draw us habitually back to platforms, exploiting our brains’ reward systems. Clickbait headlines and pseudoscience claims prey on our cognitive biases, hobbling critical thinking. And nations worldwide are considering possible military applications of neuroscience, which some planners call warfare’s “sixth domain” (adding to a list that includes land, sea, air, space and cyberspace).

As brain wearables and artificial intelligence advance, the line between human agency and machine intervention will also blur. When a wearable reshapes our thoughts and emotions, how many of our actions and decisions remain truly our own? As we begin to offload mental tasks to AI, we risk becoming overly dependent on technology, weakening independent thought and even our capacity for reflective decision-making. Should we allow AI to shape our brains and mental experiences? And how do we retain our humanity in an increasingly interconnected world remade by these two technologies?

Malicious use and even hacking of brain wearables pose another threat. From probing for information to intercepting our PINs as we think or type them, such attacks will make neural cybersecurity essential. Imagine a world where brain wearables can track what we read and see, alter perceptions, manipulate emotions or even trigger physical pain. That’s a world that may soon arrive. Already, companies such as China’s Entertech have accumulated millions of raw EEG recordings from individuals across the world through their popular consumer brain wearables, along with those individuals’ personal information and device and app usage. Entertech’s privacy policy makes plain that it also records personal information, GPS signals, device sensor data, and the computers and services a person is using, including websites they may be visiting. We must ensure that brain wearables are designed with security in mind and with device and data safeguards in place to mitigate these risks.

We stand at an inflection point at the beginning of a brain-wearable revolution. We need prudent vigilance and an open and honest debate about the risks and benefits of neurotechnology, to ensure it is used responsibly and ethically. With the right safeguards, neurotechnology could be truly empowering for individuals. Getting there will require that we recognize new digital-age rights to preserve our cognitive liberty: self-determination over our brains and mental experiences. We must do so now, before the choice is no longer ours to make.

This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.