WHOLE FOODS AND THEIR HATE CRIME HOOLIGANS – RACISM 365

Minorities and immigrants apparently do not get the benefit of respect as Whole Foods Market shoppers: there are hundreds of consumer complaints of routine race-based targeting, surveilling, shadowing, and open suspicion. Whole Foods’ inherent hostility toward people of color spending their hard-earned money on asparagus water should be assessed and named for what it is: a system designed to discourage minorities from patronizing Whole Foods locations where they are clearly not welcome, clearly not valued, and openly not respected.

This is what racism looks like.

A New York man was shopping at a Whole Foods location when, unprovoked, a loss prevention officer attacked him with a homophobic slur and choked him into unconsciousness.
He was shopping for grapes.

HOW TO EMOTIONALLY DRAIN YOUR CUSTOMERS: WHOLE FOODS MARKET GUIDE

Whole Foods Market Guide To The Mistreatment Of The Marginalized

How to mistreat your unvalued customer

Spot brown customer entering Whole Foods Market on surveillance video

Immediately stalk customer around the premises

Intentionally make customer feel anxious, intimidated, and harassed

Laugh if customer tries to complain about biased treatment

Make certain that customer does not return to location to shop

Because color of skin

Mission accomplished

AMAZON’S CEO ON HIS “BREAK-THEN-FIX” APPROACH TO FACIAL RECOGNITION TECHNOLOGY

Because we are living in the Twilight Zone, here are the thoughts of Amazon’s CEO on the contribution of Amazon’s facial recognition technology to actual or potential violations of civil and human rights; on its potential to endanger, threaten, or violate privacy and civil rights; and on its potential to disproportionately impact people of color, immigrants, and activists.

Amazon Rekognition has a 68% classification accuracy rate for women of color relative to 100% for white men
"Eventually, society will develop an 'immune response' to bad uses of technology."

– Jeff Bezos

Source: CNN, Amazon Will Keep Working With the DoD (Oct. 2018)

Welcome to the present.

PROMINENT TECH RESEARCHERS PEN OPEN LETTER TO AMAZON CITING RACIAL BIAS IN FACIAL REKOGNITION SOFTWARE

"Overall, we find [Amazon's] response to peer-reviewed research findings disappointing... We call on Amazon to stop selling Rekognition."

– Concerned Researchers
Amazon Rekognition surveillance software has a low accuracy rate of 68% for women of color – relative to a 100% accuracy rate for white males

Prominent artificial intelligence researchers, concerned about violations of the civil rights of American citizens and potential loss of liberties, have responded to Amazon’s defensive hostility in On Recent Research Auditing Commercial Facial Analysis Technology. Visit HERE to read the tech community’s open letter to the tech giant.

CONCERNED RESEARCHERS

Ali Alkhatib, Stanford University

Noura Al Moubayed, Durham University

Miguel Alonso Jr, Florida International University

Anima Anandkumar, Caltech (formerly Principal Scientist at AWS)

Akilesh Badrinaaraayanan, MILA/University of Montreal

Esube Bekele, National Research Council fellow

Yoshua Bengio, MILA/University of Montreal

Alex Berg, UNC Chapel Hill

Miles Brundage, OpenAI; Oxford; Axon AI Ethics Board

Dan Calacci, Massachusetts Institute of Technology

Pablo Samuel Castro, Google

Abir Das, IIT Kharagpur

Hal Daumé III, Microsoft Research and University of Maryland

Maria De-Arteaga, Carnegie Mellon University

Mostafa Dehghani, University of Amsterdam

Emily Denton, Google

Lucio Dery, Facebook AI Research

Priya Donti, Carnegie Mellon University

Hamid Eghbal-zadeh, Johannes Kepler University Linz

Paul Feigelfeld, IFK Vienna, Strelka Institute

Jessica Finocchiaro, University of Colorado Boulder

Andrea Frome, Google

Field Garthwaite, IRIS.TV

Timnit Gebru, Google

Sebastian Gehrmann, Harvard University

Georgia Gkioxari, Facebook AI Research

Alvin Grissom II, Ursinus College

Sergio Guadarrama, Google

Alex Hanna, Google

Bernease Herman, University of Washington

William Isaac, DeepMind

Alexia Jolicoeur-Martineau, MILA/University of Montreal

Yannis Kalantidis, Facebook AI

Khimya Khetarpal, MILA/McGill University

Michael Kim, Stanford University

Morgan Klaus Scheuerman, University of Colorado Boulder

Hugo Larochelle, Google/MILA

Erik Learned-Miller, UMass Amherst

Xing Han Lu, McGill University

Kristian Lum, Human Rights Data Analysis Group

Michael Madaio, Carnegie Mellon University

Tegan Maharaj, Mila/École Polytechnique

João Martins, Carnegie Mellon University

El Mahdi El Mhamdi, École Polytechnique Fédérale de Lausanne

Vincent Michalski, MILA/University of Montreal

Margaret Mitchell, Google

Melanie Mitchell, Portland State University and Santa Fe Institute

Ioannis Mitliagkas, MILA/University of Montreal

Bhaskar Mitra, Microsoft and University College London

Jamie Morgenstern, Georgia Institute of Technology

Bikalpa Neupane, Pennsylvania State University, UP

Ifeoma Nwogu, Rochester Institute of Technology

Vicente Ordonez-Roman, University of Virginia

Pedro O. Pinheiro

Vinodkumar Prabhakaran, Google

Parisa Rashidi, University of Florida

Anna Rohrbach, UC Berkeley

Daniel Roy, University of Toronto

Negar Rostamzadeh

Kate Saenko, Boston University

Niloufar Salehi, UC Berkeley

Anirban Santara, IIT Kharagpur (Google PhD Fellow)

Brigit Schroeder, Intel AI Lab

Laura Sevilla-Lara, University of Edinburgh

Shagun Sodhani, MILA/University of Montreal

Biplav Srivastava

Luke Stark, Microsoft Research Montreal

Rachel Thomas, fast.ai; University of San Francisco

Briana Vecchione, Cornell University

Toby Walsh, UNSW Sydney

Serena Yeung, Harvard University

Yassine Yousfi, Binghamton University

Richard Zemel, Vector & University of Toronto

WHOLE FOODS MARKET PERSONAL SHOPPER SERVICE FOR MINORITIES ONLY – SPECIAL TREATMENT EDITION

Whole Foods Market seems to have added a personal security detail service for minority shoppers only. For the whole price of nothing, the undercover loss prevention officer will follow you around the whole store to ensure that you find all of your items okay, also making sure that you pay for them. Just in case.

Just don’t try using EBT as a form of payment.

Be sure to ask them about this special service next time you visit a Whole Foods Market location.

THE CASE OF ALGORITHMIC DISCRIMINATION AND HOW IT ADVERSELY AFFECTS MINORITIES – WOMEN’S HISTORY MONTH EDITION

MIT researchers Joy Buolamwini and Deborah Raji, Stanford’s Timnit Gebru, and others have, as part of a team, dedicated remarkable effort to testing the artificial intelligence (AI) behind facial recognition software and synthesizing its data output. Their exhaustive study of bias in AI concluded that Amazon’s facial recognition software Rekognition, which is sold to law enforcement and also used for retail security surveillance, exhibits significant gender and racial bias in gender classification. In Joy’s MIT thesis, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” the team concluded that this racial and gender bias in surveillance AI adversely affects women and minority groups, who are then subject to increased surveillance, searches, interrogations, and potential loss of liberties. Writer and CNET News reporter Alfred Ng wrote that a lack of standards to prevent abuse and AI bias could “get you banned from places you’ve never been” (CNET.com), as store security is free to share its biased data about you across a vast network.
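The core method of the audit described above is to disaggregate accuracy by intersectional subgroup instead of reporting a single overall number, which is what exposes the gap between near-perfect performance for lighter-skinned men and far worse performance for darker-skinned women. Below is a minimal sketch of that kind of disaggregated audit; the function name and the toy records are illustrative assumptions, not the study's actual benchmark data (the real study used the Pilot Parliaments Benchmark):

```python
from collections import defaultdict

def subgroup_accuracy(records):
    """Compute classification accuracy separately for each demographic subgroup.

    records: iterable of (subgroup, true_label, predicted_label) tuples.
    Returns a dict mapping each subgroup to its accuracy in [0, 1].
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for subgroup, truth, prediction in records:
        total[subgroup] += 1
        if prediction == truth:
            correct[subgroup] += 1
    return {group: correct[group] / total[group] for group in total}

# Toy data mirroring the kind of disparity the audit reported:
# one subgroup classified perfectly, another frequently misclassified.
records = [
    ("lighter-skinned men", "male", "male"),
    ("lighter-skinned men", "male", "male"),
    ("darker-skinned women", "female", "male"),    # misclassified
    ("darker-skinned women", "female", "female"),
]
print(subgroup_accuracy(records))
# {'lighter-skinned men': 1.0, 'darker-skinned women': 0.5}
```

An aggregate accuracy over these four records would be 75% and would hide the disparity entirely; splitting by subgroup is what makes the bias visible.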

So next time you are browsing in a Whole Foods Market or a Target and you find yourself being stalked while Black by store security, please do think about the findings below and whether or not you are comfortable with this:

“…An error in the output of a face recognition algorithm used as [a deciding factor in surveillance] can have serious consequences. For example, someone could be wrongfully accused of a crime based on erroneous but confident misidentification of the perpetrator from security video footage analysis.”

Screen capture of actual data output using the face of TV personality Oprah Winfrey
Amazon Rekognition software demo (Source: “Ain’t I A Woman”)

“In other contexts, a demographic group that is underrepresented in benchmark datasets can nonetheless be subjected to frequent targeting…False positives and unwarranted searches pose a threat to civil liberties. Some face recognition systems have been shown to misidentify people of color, women, and young people at high rates (Klare et al., 2012).”

Screen capture of actual data output using the face of Civil Rights activist Ida B. Wells
Amazon Rekognition software demo (Source: “Ain’t I A Woman”)

“Past research has also shown that the accuracies of face recognition systems used by US-based law enforcement are systematically lower for people labeled female, Black, or between the ages of 18-30 than for other demographic cohorts (Klare et al., 2012). The latest gender classification report from the National Institute of Standards and Technology (NIST) also shows that algorithms [that the] NIST evaluated performed worse for female-labeled faces than male-labeled faces.” (source: Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification)

Help fight bias by sharing your experience or concerns with surveillance AI technology HERE.

DEAR TECH COMPANY – WHOLE FOODS STORE SURVEILLANCE EDITION

Dear Whole Foods security and surveillance team.

How do you use the thousands of Whole Foods customer photos that you capture and store in your security surveillance database? Who do you sell that information to? Who do you share that information with?

How many women are misclassified in your surveillance database as men?

How many individuals are misclassified in your surveillance database as having a criminal history? What is the measurement error of that?

How many individuals are misclassified in your surveillance database as undocumented?

How many individuals are misclassified in your surveillance database as homeless?

How many individuals are misclassified in your surveillance database as suspicious?

How many women have their home address, their work address, their vehicle make and model, their median income, their marriage status, their occupation pop-up on your surveillance screen the very second that they walk into a Whole Foods Market to buy lentils?

How many women are followed by an undercover loss prevention officer while shopping at Whole Foods because your little surveillance screen misclassifies them as having a predisposition to criminality?

How many individuals, on average, does your biased surveillance AI racially profile?

Since when did buying groceries become so f****ng taxing?

THE WEAPONIZATION OF FACIAL RECOGNITION SOFTWARE – WHOLE FOODS RETAIL EDITION

Ever wondered why they’re following YOU specifically?

According to a recent article published by the New York Times, MIT research has shown that facial recognition software does not correctly identify and classify human beings of all genders, skin tones, and facial structures; instead, its output is biased in a way that benefits mostly White males. Though some facial recognition systems claim to be able to detect criminality in the faces of public citizens, Amazon’s face surveillance technology Rekognition in fact falsely matched 28 members of Congress with mugshots in a test conducted by the American Civil Liberties Union.

Rekognition software has a 32% error rate in analyzing the faces of women of color

Despite the data-science proof of gender and racial bias in Amazon’s retail surveillance software, women and men of all races and backgrounds continue to be surveilled, monitored, and their images stored in Amazon’s facial recognition database for private use and/or possible misuse. Unregulated and unchecked by any agency.
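The 32% figure above is just the flip side of the 68% accuracy rate cited earlier (error rate = 100% − accuracy), and its scale is easier to grasp as expected misidentifications. A back-of-the-envelope sketch, where the shopper count of 1,000 is a made-up assumption for illustration, not store data:

```python
def expected_misclassifications(n_people, error_rate):
    """Expected number of people misclassified, given an error rate in [0, 1]."""
    return n_people * error_rate

# Assumed figures for illustration: 1,000 women of color passing a camera
# with a 32% error rate, versus 1,000 white men at the reported 0% error.
print(expected_misclassifications(1000, 0.32))  # roughly 320 people
print(expected_misclassifications(1000, 0.00))  # 0 people
```

At the reported rates, roughly one in three women of color walking past such a camera would be misclassified, while white men would essentially never be.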

To read more about Amazon’s commercial facial analysis system Rekognition, and how it may be affecting you, your family, and your community, please take the time to read any of the sources below on algorithmic racial bias in surveillance software:

Welcome To The Era Of Facial Recognition Technology: Changing The Game On How Minorities Are Treated

Racial And Gender Bias in Amazon Rekognition – Commercial AI Systems For Analyzing Faces

TED Talk: How I’m Fighting Bias In Algorithms

ACLU: Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots

Facial Recognition Fail: Amazon Project Mistakes Lawmakers With Suspects

Study Finds Racial Bias in Amazon’s Facial Recognition Tech

THE DISCRIMINATION SAGA CONTINUES AT WHOLE FOODS – WOMEN’S HISTORY MONTH EDITION

Women.

Caretakers, lovers, cancer survivors, fixers, friends, motivators, social workers, healers, historians, architects, novelists, screenwriters, artists, performers, daughters, sisters, practitioners, and also, the very second that they enter a Whole Foods, nothing but common thieves.

Happy Women’s History Month!

A WHOLE BOOK ON HOW WHOLE FOODS SYSTEMATICALLY DISCRIMINATES – WOMEN’S HISTORY MONTH EDITION

Women.

Shareholders, decision makers, CEOs, project managers, philosophers, accountants, coders, analysts, counselors, broadcasters, confidants, campaigners, and also tracked by facial recognition surveillance and deemed suspicious the very minute that they step inside of a Whole Foods Market.

Happy Women’s History Month!

A WHOLE CHAPTER ON THE SPECIAL TREATMENT OF WOMEN AT WHOLE FOODS MARKET – WOMEN’S HISTORY MONTH EDITION

Women.

Leaders, managers, influencers, editors, strategists, child bearers, home makers, teachers, directors, advisors, producers, photographers, storytellers, doctors, and also, the minute that they walk into a Whole Foods, potential thieves who should be closely monitored.

Happy Women’s History Month!!

CORPORATE CRIME FAM TRIO PUTTING RESOURCES TOWARDS THE HARASSMENT OF NEW YORK WRITER

Dear Whole Foods Mafia readers.

Okay, I’ve got to put the Whole Foods Mob Family Tree on blast once again, because the man seen in the photos below was outside of my apartment building this afternoon snapping pictures of me to enter into Amazon’s facial recognition surveillance database. Apparently I am famous now, so I asked him to make sure to get my good side. I’m hoping that he honored that. It would have been really cool if he could have stalked me all the way to Trader Joe’s and gotten candids of me walking in without being racially profiled and creepily stalked by undercover loss prevention officers.

Still the same ol’ me, I am now fearless girl with a bulletproof vest on.

 #SuperActivist  #NotScaredToExist  #MinorityIsNotACrime