THE WEAPONIZATION OF FACIAL RECOGNITION SOFTWARE – WHOLE FOODS RETAIL EDITION

Ever wondered why they’re following YOU specifically?

According to a recent article published by the New York Times, MIT research has shown that facial recognition software does not correctly identify and classify human beings of all genders, skin tones, and facial structures: its results are biased, performing best for White males. And though some facial recognition systems claim they can detect criminality in the faces of ordinary citizens, Amazon's own face surveillance technology, Rekognition, falsely matched 28 members of Congress with criminal mugshots in a test conducted by the American Civil Liberties Union.
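For readers curious how a test like the ACLU's works in practice: a face photo is compared against a database of mugshots, and anything scoring above a similarity threshold counts as a "match." Below is a minimal sketch using the real AWS Rekognition CompareFaces API via boto3, at the 80% default threshold the ACLU reportedly used. The filenames are hypothetical, and AWS credentials are assumed to be configured.

```python
# Minimal sketch of a Rekognition face-comparison check, similar in spirit
# to the ACLU's test. "lawmaker.jpg" and "mugshot.jpg" are hypothetical
# local files; AWS credentials must already be configured.
import boto3

client = boto3.client("rekognition")

with open("lawmaker.jpg", "rb") as src, open("mugshot.jpg", "rb") as tgt:
    response = client.compare_faces(
        SourceImage={"Bytes": src.read()},
        TargetImage={"Bytes": tgt.read()},
        SimilarityThreshold=80.0,  # Rekognition's default threshold
    )

# Any entry in FaceMatches counts as a "match" at the chosen threshold.
# At 80%, that can include false matches like the 28 the ACLU reported.
for match in response["FaceMatches"]:
    print(f"Matched with similarity {match['Similarity']:.1f}%")
```

Note that the threshold is the whole game here: a lower threshold produces more matches, including false ones, which is why the choice of 80% versus a stricter setting matters so much when the subjects are real people.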

Rekognition software has a 32% error rate in analyzing the faces of women of color.
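An error-rate figure like this comes from an audit: researchers run the classifier over a benchmark of labeled face images, then break the misclassification rate out by demographic subgroup. A minimal sketch of that computation, using hypothetical toy records in place of a real benchmark, follows.

```python
# Sketch of a per-subgroup error-rate audit. The records below are
# hypothetical toy data; a real audit (such as MIT's) uses hundreds of
# labeled benchmark images per group.
from collections import defaultdict

# Each record: (subgroup, true label, label the software predicted).
records = [
    ("darker-skinned women", "female", "male"),
    ("darker-skinned women", "female", "female"),
    ("lighter-skinned men", "male", "male"),
]

errors = defaultdict(int)
totals = defaultdict(int)
for group, truth, predicted in records:
    totals[group] += 1
    if predicted != truth:
        errors[group] += 1

# A large gap between groups is the signature of a biased system.
for group in totals:
    rate = 100.0 * errors[group] / totals[group]
    print(f"{group}: {rate:.0f}% misclassified")
```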

Despite data-science evidence of gender and racial bias in Amazon's retail surveillance software, women and men of all races and backgrounds continue to be surveilled and monitored, their images stored in Amazon's facial recognition database for private use and possible misuse, unregulated and unchecked by any agency.

To read more about Amazon's commercial facial analysis system Rekognition, and how it may be affecting you, your family, and your community, please take the time to read any of the sources below on algorithmic racial and gender bias in surveillance software:

Welcome To The Era Of Facial Recognition Technology: Changing The Game On How Minorities Are Treated

Racial And Gender Bias in Amazon Rekognition – Commercial AI Systems For Analyzing Faces

TED Talk: How I’m Fighting Bias In Algorithms

ACLU: Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots

Facial Recognition Fail: Amazon Project Mistakes Lawmakers With Suspects

Study Finds Racial Bias in Amazon’s Facial Recognition Tech