Incident 49: AI Beauty Judge Did Not Like Dark Skin
CSETv0 Taxonomy Classifications
Taxonomy Details
Full Description
In 2016, Beauty.AI, artificial intelligence software designed by Youth Laboratories and supported by Microsoft, was used to judge the first international beauty contest decided by an algorithm. Of the 600,000 contestants who submitted selfies to be judged by Beauty.AI, the software chose 44 winners, the majority of whom were white; a handful were Asian, and only one had dark skin. While the majority of contestants were white, approximately 40,000 submissions came from India and another 9,000 from Africa. Controversy ensued, with critics arguing that Beauty.AI was racially biased because it had not been trained on enough images of people of color when learning to judge beauty (a simplified illustration of this effect appears after the taxonomy details below).
Short Description
In 2016, after the artificial intelligence software Beauty.AI judged an international beauty contest and declared a majority of the winners to be white, researchers found that Beauty.AI was racially biased in how it determined beauty.
Severity
Negligible
Harm Distribution Basis
Race
Harm Type
Harm to intangible property
AI System Description
Artificial intelligence software that uses deep learning algorithms to evaluate beauty based on factors such as facial symmetry, blemishes, wrinkles, estimated age versus apparent age, and similarity to actors and models.
System Developer
Youth Laboratories
Sector of Deployment
Arts, entertainment and recreation
Relevant AI functions
Perception, Cognition, Action
AI Techniques
Deep learning, open-source
AI Applications
biometrics, image classification
Location
Global
Named Entities
Youth Laboratories, Microsoft
Technology Purveyor
Youth Laboratories, Microsoft, Insilico Medicine
Beginning Date
1/2016
Ending Date
6/2016
Near Miss
Unclear/unknown
Intent
Accident
Lives Lost
No
Data Inputs
images of people's faces
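The classifications above describe a deep-learning system that scores faces and, per the full description, was not trained on enough images of people of color. The sketch below is a hypothetical, simplified illustration of that failure mode, not Beauty.AI's actual code: it uses synthetic "face embeddings" and a nearest-to-the-training-mean scorer to show how a model trained on a skewed dataset rates the under-represented group lower. All names, group sizes, and numbers here are assumptions made for illustration.

```python
# Hypothetical illustration of training-data bias; NOT Beauty.AI's implementation.
import numpy as np

rng = np.random.default_rng(0)

def make_embeddings(center: np.ndarray, n: int) -> np.ndarray:
    """Synthetic stand-ins for deep-learning face embeddings clustered around a group center."""
    return center + rng.normal(scale=0.3, size=(n, center.size))

# Two made-up demographic groups occupying different regions of embedding space.
group_a_center = np.array([1.0, 0.0, 0.0])
group_b_center = np.array([0.0, 1.0, 0.0])

# Skewed training set: 900 examples from group A, only 100 from group B.
train = np.vstack([
    make_embeddings(group_a_center, 900),
    make_embeddings(group_b_center, 100),
])

# Toy "model": a face scores higher the closer it sits to the training mean,
# i.e. the more it resembles what the training data made look "typical".
reference = train.mean(axis=0)

def score(embedding: np.ndarray) -> float:
    return float(-np.linalg.norm(embedding - reference))

test_a = make_embeddings(group_a_center, 50)
test_b = make_embeddings(group_b_center, 50)

print("mean score, group A:", np.mean([score(e) for e in test_a]))
print("mean score, group B:", np.mean([score(e) for e in test_b]))
# Group A scores higher purely because it dominated the training data.
```

Running the script prints a noticeably higher mean score for the over-represented group, which mirrors the pattern described in the contest results.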
Incident Reports
Reports Timeline
Image: Flickr/Veronica Jauriqui
Beauty pageants have always been political. After all, what speaks more strongly to how we see each other than which physical traits we reward as beautiful, and which we code as ugly? It wasn't until 1983 tha…
The first international beauty contest decided by an algorithm has sparked controversy after the results revealed one glaring factor linking the winners
The first international beauty contest judged by “machines” was supposed to use objecti…
Only a few winners were Asian and one had dark skin; most were white.
Just months after Microsoft's Tay artificial intelligence sent racist messages on Twitter, another AI seems to have followed suit.
More than 6,000 selfies of individuals w…
With more than 6,000 applicants from over 100 countries competing, the first international beauty contest judged entirely by artificial intelligence just came to an end. The results are a bit disheartening.
The team of judges, a five robot …
It’s not the first time artificial intelligence has been in the spotlight for apparent racism, but Beauty.AI’s recent competition results have caused controversy by clearly favouring light skin.
The competition, which ran online and was ope…
If you’re one who joins beauty pageants or merely watches them, what would you feel about a computer algorithm judging a person’s facial attributes? Perhaps we should ask those who actually volunteered to be contestants in a beauty contest …
An AI designed to do X will eventually fail to do X. Spam filters block important emails, GPS provides faulty directions, machine translations corrupt the meaning of phrases, autocorrect replaces a desired word with a wrong one, biometric s…
It’s long been thought that robots equipped with artificial intelligence would be the cold, purely objective counterpart to humans’ emotional subjectivity. Unfortunately, it would seem that many of our imperfections have found their way int…
In 2016, researchers from Boston University and Microsoft were working on artificial intelligence algorithms when they discovered racist and sexist tendencies in the technology underlying some of the most popular and critical services we us…
Similar Incidents
Gender Biases in Google Translate
TayBot