Incident 59: Gender Biases in Google Translate
Entities
Incident Stats
CSETv0 Taxonomy Classifications
Full Description
A 2016 Cornell University study highlighted Google Translate's pattern of assigning gender to occupations in a way that shows an implicit gender bias against women. When translating from languages with gender-neutral pronouns (e.g., Turkish, Finnish), Google Translate added gender to the translated phrases: "Historian," "Doctor," "President," "Engineer," and "Soldier" were assigned male pronouns, while "Nurse," "Teacher," and "Shop Assistant" were assigned female pronouns.
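The behavior described above can be probed directly. Below is a minimal sketch, assuming the google-cloud-translate Python client (translate_v2 module) and valid Google Cloud credentials; the occupation phrases are illustrative and not taken from the study.

```python
# Minimal sketch: probe Turkish -> English translations for gendered pronouns.
# Assumes the google-cloud-translate client library and valid credentials;
# the phrases below are illustrative, not the study's actual test set.
from google.cloud import translate_v2 as translate

client = translate.Client()

# Turkish "o" is gender-neutral: "o bir doktor" means "he/she is a doctor".
phrases = {
    "doctor": "o bir doktor",
    "engineer": "o bir mühendis",
    "nurse": "o bir hemşire",
    "teacher": "o bir öğretmen",
}

for occupation, sentence in phrases.items():
    result = client.translate(sentence, source_language="tr", target_language="en")
    # Inspect which pronoun the English output introduces for each occupation.
    print(f"{occupation:10s} -> {result['translatedText']}")
```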
Short Description
A 2016 Cornell University study highlighted Google Translate's pattern of assigning gender to occupations in a way that shows an implicit gender bias against women.
Severity
Negligible
Harm Distribution Basis
Sex
Harm Type
Harm to social or political systems
AI System Description
Google Translate, a software service that translates between many languages
System Developer
Google
Sector of Deployment
Information and communication
Relevant AI functions
Perception, Cognition, Action
AI Techniques
Google Translate
AI Applications
language API, language translation
Named Entities
Google Translate, Google
Technology Purveyor
Google
Beginning Date
2016-01-01
Ending Date
2016-01-01
Near Miss
Harm caused
Intent
Unclear
Lives Lost
No
Data Inputs
User entered translation requests
Incident Reports
Reports Timeline
Artificial intelligence and machine learning are in a period of astounding growth. However, there are concerns that these technologies may be used, either with or without intention, to perpetuate the prejudice and unfairness that unfortunat…
Even artificial intelligence can acquire biases against race and gender
One of the great promises of artificial intelligence (AI) is a world free of petty human biases. Hiring by algorithm would give men and women an equal chance at work, t…
Machine learning algorithms are picking up deeply ingrained race and gender prejudices concealed within the patterns of language use, scientists say
An artificial intelligence tool that has revolutionised the ability of computers to interpr…
In debates over the future of artificial intelligence, many experts think of these machine-based systems as coldly logical and objectively rational. But in a new study, Princeton University-based researchers have demonstrated how machines c…
In the Turkish language, there is one pronoun, “o,” that covers every kind of singular third person. Whether it’s a he, a she, or an it, it’s an “o.” That’s not the case in English. So when Google Translate goes from Turkish to English, it …
So much of our life is determined by algorithms. From what you see on your Facebook News Feed, to the books and knickknacks recommended to you by Amazon, to the disturbing videos YouTube shows to your children, our attention is systematical…
Parents know one particular challenge of raising kids all too well: teaching them to do what we say, not what we do.
A similar challenge has hit artificial intelligence.
As more apps and software use AI to automate tasks, …
Recently there has been a growing concern about machine bias, where trained statistical models grow to reflect controversial societal asymmetries, such as gender or racial bias. A significant number of AI tools have recently been suggested …
Google is making an effort to reduce perceived gender bias in Google Translate, it announced today. Starting this week, users who translate words and phrases in supported languages will get both feminine and masculine translations; “o bir d…
An experiment shows that Google Translate systematically changes the gender of translations when they do not fit with stereotypes. It is all because of English, Google says.
If you were to read a story about male and female historians trans…
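A rough sketch of the kind of systematic probe this experiment describes is given below, assuming the same google-cloud-translate client as in the earlier sketch; the occupation list and pronoun matching are illustrative, not the researchers' actual methodology.

```python
# Rough sketch of a systematic pronoun tally over gender-neutral Turkish inputs.
# Assumes the google-cloud-translate client library and valid credentials;
# the occupation list is illustrative, not the one used in the experiment.
import re
from collections import Counter

from google.cloud import translate_v2 as translate

client = translate.Client()
occupations = ["doktor", "hemşire", "mühendis", "öğretmen", "asker", "tarihçi"]

counts = Counter()
for occupation in occupations:
    sentence = f"o bir {occupation}"  # gender-neutral Turkish: "he/she is a ..."
    english = client.translate(
        sentence, source_language="tr", target_language="en"
    )["translatedText"].lower()
    # Classify each translation by the pronoun it introduces.
    if re.search(r"\bhe\b", english):
        counts["he"] += 1
    elif re.search(r"\bshe\b", english):
        counts["she"] += 1
    else:
        counts["neutral/other"] += 1

print(counts)
```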
Variants
Similar Incidents
AI Beauty Judge Did Not Like Dark Skin
Biased Sentiment Analysis