AI Incident Database
Entities

Twitter non-white users

Incidents Harmed By

Incident 103 · 5 Reports
Twitter’s Image Cropping Tool Allegedly Showed Gender and Racial Bias

2020-09-18

Researchers revealed that Twitter's photo-cropping algorithm favored white faces and women's faces in photos containing multiple faces, prompting the company to stop using it on its mobile platforms.


Related Entities

Entity

Twitter

Incidents involved as both Developer and Deployer
  • Incident 103 · 5 Reports

    Twitter’s Image Cropping Tool Allegedly Showed Gender and Racial Bias

Entity

Twitter Users

Incidents Harmed By
  • Incident 103 · 5 Reports

    Twitter’s Image Cropping Tool Allegedly Showed Gender and Racial Bias

Entity

Twitter non-male users

Incidents Harmed By
  • Incident 103 · 5 Reports

    Twitter’s Image Cropping Tool Allegedly Showed Gender and Racial Bias


2023 - AI Incident Database
