Biometric surveillance: The erosion of human rights by data-generated identities


Joseph Shiovitz, Graduate Research Assistant, MPA Candidate in Public Policy, John Jay College of Criminal Justice

Image: Stockvault

Surveillance technologies are ubiquitous in daily life, in spaces both public and private, encountered voluntarily, involuntarily, and sometimes unknowingly. But surveillance has evolved beyond security cameras recording video of subjects’ movements. Precise locations, political ideologies, spending habits, health information, and even fingerprints can be easily traced using the right software and algorithms. A core issue at the intersection of surveillance and human rights is the ethical question of whose rights weigh more heavily in different situations, especially in criminal justice, where the rights of individuals can come into conflict with the rights of the state.

As noted by Dr. Matthias Wienroth in his presentation on surveillance-relevant human rights to the Center for International Human Rights at the Transatlantic Forum event on March 23, 2022, “a key concern around the consideration of human rights in surveillance is a lack of information about what personal data is being collected, where and for what purposes it may be used.”

Dr. Wienroth argues that the data being collected about us now serves as the basis for a new type of human identity, one that threatens to de-individualize each person and, in turn, to erode aspects of their human rights. Mass data collection often does not link specific datasets to any particular person; rather, it aggregates them to generate categories that form the basis of a risk-assessment score. Any behavior or characteristic deemed out of the ordinary could raise an individual’s risk-assessment score, thus limiting their access to spaces and services throughout society. This aggregation also blurs the boundaries between different uses of data: information collected for one purpose is then used for something different, often without the subject’s awareness.

These new types of identity, called data doubles, “are not the individual itself, because they prioritize certain characteristics and ignore others…” Dr. Wienroth warns that certain body forms could emerge as normative, rendering others vulnerable to bias and discrimination, while the human body is reduced to a datafied object serving as a tool for decision-making.

The datafication of individuals forces us to rethink concepts of human rights protection, as recent cases like the NYPD DNA database suggest. In her response to Dr. Wienroth’s talk, Dr. Michelle Strah, Visiting Scholar at CIHR, noted that the NYPD DNA database case underscores the challenges around data protection and data standards in criminal justice surveillance: How long can data be kept? Who has access? What are the allowed uses and limits on use? She also brought up a recent case in San Francisco where DNA provided by victims for the purposes of sexual assault investigations was not used for that purpose alone; it was also used to prosecute those same victims for other, unrelated crimes.

Dr. Strah further situated the “datafication” of individuals within Shoshana Zuboff’s concept of “surveillance capitalism.” She noted that data doubles are created by corporate actors that profit from the aggregation of big data and operate as de facto criminologists by selling “certainty” to law enforcement, when the field is fundamentally about the uncertainties of “(un)knowing,” as outlined by Dr. Wienroth.

The fervent drive to collect biometric data could establish normative risk identities that diminish our individuality and obscure how determinations are made about our access to parts of society. In his discussion of ‘unknowing,’ Dr. Wienroth asserts that strategic ignorance purposefully exploits gaps in knowledge to legitimize an inherent bias of data collection, namely the assumption that ever more extensive data collection will always be needed, while simultaneously ignoring certain aspects of an individual. This paradigm implies that collecting and retaining personal data is an inherent good, even when doing so conflicts with the very human rights it is invoked to protect.

This talk is now available on YouTube: https://youtu.be/5pSIgpFJBGE. For more discussion of human rights, surveillance capitalism, and the ethical challenges of biometrics, please see the CIHR AI Team blog on Medium: https://jjccihr.medium.com/
