Incident 179: DALL-E 2 Reported for Gender and Racially Biased Outputs
Entities
Incident Stats
GMF Taxonomy Classifications
Known AI Goal
Visual Art Generation
Known AI Technology
Transformer, Distributional Learning
Known AI Technical Failure
Distributional Bias, Unsafe Exposure or Access, Misinformation Generation Hazard, Inappropriate Training Content
Potential AI Technical Failure
Unauthorized Data, Lack of Transparency
Incident Reports
Reports Timeline
Summary
Below, we summarize initial findings on potential risks associated with DALL·E 2, and mitigations aimed at addressing those risks as part of the ongoing Preview of this technology. We are sharing these findings in order to enable br…
You may have seen some weird and whimsical pictures floating around the internet recently. There’s a Shiba Inu dog wearing a beret and black turtleneck. And a sea otter in the style of “Girl with a Pearl Earring” by the Dutch painter Vermee…
Researchers experimenting with OpenAI's text-to-image tool, DALL-E 2, noticed that it appears to be covertly appending words such as "black" and "female" to image prompts, seemingly in an effort to diversify its output.
Artificial intelligence fi…
Variants
Similar Incidents