DEAR TECH COMPANY – WHOLE FOODS STORE SURVEILLANCE EDITION

Dear Whole Foods security and surveillance team,

How do you use the thousands of Whole Foods customer photos that you capture and store in your security surveillance database? Who do you sell that information to? Who do you share that information with?

How many women are misclassified in your surveillance database as men?

How many individuals are misclassified in your surveillance database as having a criminal history? What is the error rate of those classifications?

How many individuals are misclassified in your surveillance database as undocumented?

How many individuals are misclassified in your surveillance database as homeless?

How many individuals are misclassified in your surveillance database as suspicious?

How many women have their home address, their work address, their vehicle make and model, their median income, their marital status, and their occupation pop up on your surveillance screen the very second they walk into a Whole Foods Market to buy lentils?

How many women are followed by an undercover loss prevention officer while shopping at Whole Foods because your little surveillance screen misclassifies them as having a predisposition to criminality?

How many individuals, on average, does your biased surveillance AI racially profile?

Since when did buying groceries become so f****ng taxing?




THE WEAPONIZATION OF FACIAL RECOGNITION SOFTWARE – WHOLE FOODS RETAIL EDITION

Ever wondered why they’re following YOU specifically?

According to a recent article in the New York Times, MIT research has shown that facial recognition software does not correctly identify and classify people of all genders, skin tones, and facial structures; its output is biased in ways that mostly benefit White males. And though some facial recognition systems claim they can detect criminality in the faces of ordinary citizens, Amazon's face surveillance technology, Rekognition, falsely matched 28 members of Congress with criminal mugshots in a test conducted by the American Civil Liberties Union.

Rekognition software has a 32% error rate in analyzing the faces of women of color.

Despite the data-science evidence of gender and racial bias in Amazon's surveillance software, women and men of all races and backgrounds continue to be surveilled and monitored, and their images continue to be stored in Amazon's facial recognition database for private use and possible misuse, unregulated and unchecked by any agency.
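Figures like the 32% error rate come from audits that run a face analysis system over a labeled benchmark and compare how often it gets each demographic subgroup wrong. The short Python sketch below is purely illustrative, with made-up example records rather than Amazon's or MIT's actual data or code, but it shows how such a per-group error rate is calculated:

```python
# Illustrative sketch of a bias audit: compare misclassification rates across
# demographic subgroups. The records below are invented for demonstration only.
from collections import defaultdict

# Each record: (subgroup, true_gender, gender_predicted_by_the_system)
benchmark = [
    ("darker-skinned women", "female", "male"),
    ("darker-skinned women", "female", "female"),
    ("lighter-skinned men",  "male",   "male"),
    ("lighter-skinned men",  "male",   "male"),
    # ... a real audit uses hundreds of labeled faces per subgroup
]

errors = defaultdict(int)
totals = defaultdict(int)
for group, truth, prediction in benchmark:
    totals[group] += 1
    if prediction != truth:
        errors[group] += 1

for group in totals:
    rate = errors[group] / totals[group]
    print(f"{group}: {rate:.0%} misclassification rate "
          f"({errors[group]}/{totals[group]})")
```

When the error rate for one subgroup is several times the rate for another, as the MIT audit found for darker-skinned women compared with lighter-skinned men, the system's mistakes fall disproportionately on that subgroup every time it is deployed.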

To read more about Amazon’s commercial facial analysis system Rekognition, and how it may be affecting you, your family, and your community, please take the time to read any of the sources below on algorithmic racial bias in surveillance software:

Welcome To The Era Of Facial Recognition Technology: Changing The Game On How Minorities Are Treated

Racial And Gender Bias in Amazon Rekognition – Commercial AI Systems For Analyzing Faces

TED Talk: How I’m Fighting Bias In Algorithms

ACLU: Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots

Facial Recognition Fail: Amazon Project Mistakes Lawmakers With Suspects

Study Finds Racial Bias in Amazon’s Facial Recognition Tech