Breaching the Last Bastion of the Human Psyche: Neural Data as Biometrics

Pete Haas
Associate Editor
Loyola University Chicago, School of Law, JD 2025

Earlier this year, the New York Times reported on a proposed amendment to the Colorado Privacy Act and the impact it would have on neurotechnology, which relies on “neural data” and already has noteworthy support within programming communities. The amendment aims to address not the labs and medical studies conducted within clinics, but how neural data may be used in a consumer context. In doing so, the Colorado Privacy Act goes further than Illinois’ pioneering Biometric Information Privacy Act (BIPA).

Brief overview of biometric law

BIPA is the prominent law establishing consumer protections for biometrics. Within the privacy space, there are few laws like it, though Colorado’s bill is of a similar nature. BIPA established the scope of protected biometric identifiers: a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry. BIPA is explicit about what it protects, leaving little room for debate over what is, and what is not, a biometric identifier. What BIPA does not anticipate is the expanding range of biometrics that can be commercially collected from a consumer.

The most commonly known biometric in use today is the fingerprint. As the need for identity assurance has grown, in the name of a more secure society, fingerprints and other biometrics have become regularly used by government organizations: the Department of Justice to identify arrestees, the Department of Defense to identify combatants, and the Department of Homeland Security to identify immigrants who may be entering the US illegally. Those biometrics, however, may be obtained in any number of ways, including through third parties. Clearview AI is among the companies that employ facial recognition software to identify virtually anyone in public. Clearview AI once made its services available to both private and public organizations, though it now exclusively supports law enforcement. Uses of biometric information collected by companies like Clearview AI extend well beyond the original purpose of collection, such as banning people from sports venues.

Adding neural data to biometric identifiers

Where the Colorado Privacy Act improves on BIPA is in adding neural data to the list of protected sensitive information. From a privacy perspective, biometric information is considered sensitive personal information. A person’s collective biometric identifiers measure who that particular person is and serve as a unique identifier for that person, one far more sensitive and distinctive than a Social Security number, photo, or address, even combined. Because biometrics are so closely tied to our identities, they have been made subject to privacy laws, BIPA and the Colorado Privacy Act included.

Neural data is perhaps the most sensitive and personal information a person can expose to the external world. It is collected through consumer electronics and wearable technology that monitor brainwave activity and correlate that activity with basic thoughts and emotions. Combined with accurate interpretation of what those brainwaves signify, neural data could give anyone access to how a person thinks, what they are thinking, and what propensities they may hold toward any given topic. Colorado rightly decided this technology should be carefully regulated to protect consumers. Given facial recognition software’s concerning history of misidentifying people, applying similar technology to determine what a person is thinking could have catastrophic consequences.
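To make concrete how little stands between a raw brainwave reading and an inference about a person’s mental state, consider the short Python sketch below. Everything in it is invented for illustration: the hypothetical headset readings, the thresholds, and the labels do not come from any real device, study, or statute. Real neurotechnology pipelines are far more sophisticated, but the inference step can be this simple.

    # Purely illustrative: the readings, thresholds, and labels below are
    # invented for this sketch and reflect no real device or research.

    # Simulated band-power readings from one sampling window of a
    # hypothetical consumer EEG headset.
    sample = {"alpha": 12.4, "beta": 28.1, "theta": 6.3}

    def infer_state(bands):
        """Naively map relative band power to an 'emotional state' label."""
        # A high beta-to-alpha ratio is popularly associated with focus
        # or stress; the 2.0 cutoff here is arbitrary.
        if bands["beta"] / bands["alpha"] > 2.0:
            return "stressed"
        if bands["theta"] > bands["alpha"]:
            return "drowsy"
        return "relaxed"

    # The label, not the raw signal, is what gets stored and shared
    # downstream once a vendor runs this kind of classification.
    print(infer_state(sample))  # prints "stressed"

The point of the sketch is simply that the leap from signal to inference requires almost nothing, which is why neural data deserves treatment at least as protective as the identifiers BIPA enumerates.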

A concise case for neural data as biometrics

The privacy concerns surrounding neural data are rife with potential for missteps and abuse, not the least of which can be foreseen from the history of legally recognized biometrics. Neurotechnology could provide a gateway to hefty legal consequences, including erroneously founded ones. For instance, this new technology could register that innocent individuals harbor terrorist ideals, pose a threat as sexual offenders, or are homicidal. Alternatively, neurotechnology monitoring an employee with legitimate concerns about their employer may be the trigger that leads to that employee’s termination. Data collected through neurotechnology must be protected at the same level as, if not higher than, traditional biometrics.

While widespread use of such neurotechnology would likely focus on recommending the next video or product, experience with other biometrics suggests it would not stop there. It could become the next avenue for law enforcement to promote and create a safer society, or for employers to “better understand their employees.” Considering the potential for abuse of this technology and its sheer impact on privacy, neural data must receive the highest protections of the law.