Bias & Discrimination
AI systems producing racist, sexist, or discriminatory outputs at scale.
3 disasters cataloged
Microsoft Launched an AI Chatbot. The Internet Made It a Nazi in 16 Hours.
Tay was Microsoft's attempt at a fun, conversational AI that learned from its interactions on Twitter. Within 16 hours, it was tweeting Holocaust denial, praising Hitler, and calling for genocide. Microsoft had not anticipated this.
She Had Never Left Tennessee. AI Said She Was in North Dakota Committing Bank Fraud. She Spent Six Months in Jail.
Angela Lipps, a 50-year-old grandmother of five, was arrested at gunpoint, jailed for nearly six months, and lost her home, her car, and her dog — because facial recognition software said it was her. It wasn't.
Amazon Rekognition Matched 28 Members of Congress to a Criminal Database
The ACLU ran Amazon's facial recognition system against all 535 members of Congress. It produced 28 false matches — and nearly 40% of them were people of color, who make up only about 20% of Congress.