Cops and Facial Recognition: A Detriment To Society?

Introduction

Facial recognition is a technology that can benefit various industries in certain circumstances. However, technology is only as good as the people who create it and ensure that it works properly. Facial recognition software can also have dire effects if placed in the hands of someone with the wrong intentions. Problems with this technology can arise from user biases, the misidentification of people of color, and data inaccuracies. It also raises the questions of how much data is acceptable to use and to what extent we should allow technology to access our data before it becomes a violation of privacy.

Source: Guardian/Getty Images

Police departments across the country have been utilizing facial recognition technology to compare suspects to images of people found across the internet and in databases. But what happens when detectives and police use facial recognition to solve a case and it results in a wrongful detainment? This is exactly what occurred when a DeKalb County man was wrongfully arrested by the Jefferson Parish Sheriff's Office.

Background

Randall Reid, 28, was incarcerated on November 25th when DeKalb officials booked him as a fugitive on an arrest warrant issued by the JPSO. The Louisiana officers had used facial recognition technology to identify suspects in a series of high-end purse robberies in the area. The officers relied directly on the algorithm's matches, which resulted in an incorrect match and the wrongful detainment of the DeKalb resident. The Associated Press story is linked here

He was in jail for over a week on a charge he knew nothing about, from a place he had never been. He was released on December 1, when officers realized that they had made a mistake and had not ensured that Reid matched the full description of the suspect in the robbery.

Jefferson Parish Sheriff Joe Lopinto
Source: Max Becherer/ The Times Picayune

Relevance

This situation ties directly into what we are learning about algorithms. The TED talk by Cathy O'Neil explains that algorithms are opinions embedded in code. We have learned that algorithms can be wrong and have negative consequences even when they are built with good intentions. Programs and software play into their creators' biases if they are not audited and deliberately designed to avoid or minimize these effects. The same can be said for facial recognition technology: law enforcement cannot rely solely on the matches; they should conduct thorough investigations and remain conscious of potential false-positive matches and of their own subconscious biases.


As Ruha Benjamin stated, "Racism among other axes of domination helps to produce this fragmented imagination, misery for some, monopolies for other". This is evident in the way that private companies sell their algorithms and facial recognition technology to governments for profit without considering the unfavorable effects they can have on people. Law enforcement agencies purchase software with biases embedded in it, which furthers injustice against people of color and women. Our society shapes technology and the way we use it, so users and creators must be conscious of these dynamics to prevent unintended consequences.

Significance

The two biggest problems that jump out to me are the lack of data privacy and the inaccuracies for people of color. This is becoming an increasingly pressing issue because of how common it has become for police to rely solely on facial recognition to solve criminal cases. Facial recognition technology is built around the premise of using captured images from databases or social media posts to identify an individual. However, this type of technology is a form of surveillance, and without the explicit consent of the person involved, it is a violation of privacy. Another issue with facial recognition is that it has been shown to be highly inaccurate for women and people of color, demonstrating an inherent bias. Joy Buolamwini, a researcher at the Massachusetts Institute of Technology, exposed the racial and gender biases in facial analysis software.
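To see why relying solely on a "match" is risky, consider how these systems typically work: each face image is reduced to an embedding vector, and two faces are declared a match when their similarity clears a threshold. The sketch below is purely illustrative, with made-up numbers; real systems use embeddings learned by neural networks, and the threshold and vectors here are hypothetical. The point is that a different person can still score above the threshold, producing exactly the kind of false positive that led to Reid's arrest.

```python
# Illustrative sketch only: a toy face-matching comparison with
# hypothetical embedding vectors (real systems use learned embeddings).
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(probe, candidate, threshold=0.9):
    """A system declares a 'match' when similarity exceeds a threshold.
    A different person can still clear the threshold: a false positive."""
    return cosine_similarity(probe, candidate) >= threshold

# Hypothetical embeddings: a suspect photo vs. two database entries.
suspect = [0.9, 0.1, 0.3]
same_person = [0.88, 0.12, 0.31]   # genuinely the same face
lookalike = [0.85, 0.2, 0.28]      # a different person who scores high

print(is_match(suspect, same_person))  # True: correct identification
print(is_match(suspect, lookalike))    # also True: a false positive
```

Because both comparisons clear the threshold, the algorithm alone cannot distinguish the right person from a lookalike; that distinction requires the kind of corroborating investigation the JPSO officers skipped.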


MIT Researcher Joy Buolamwini
Source: AP Photo/Steven Senne


This information is important because there is already a fraught legal climate between African Americans and police due to the brutality that African Americans face at the hands of police. There is a lot that needs to be resolved in our legal and justice system alone, without the added complexity of issues that stem from law enforcement agencies having access to facial recognition technology. Furthermore, the technology is not reliable enough to be used by law enforcement agencies. We can learn from this situation by implementing laws and bans that prevent police from using this technology, or by safeguarding it and requiring consent from the accused before running a scan to test for a match.
