Google

Incidents involved as both Developer and Deployer

Incident 45 · 29 Reports
Defamation via AutoComplete

2011-04-05

Google's autocomplete feature, together with its image search results, reportedly defamed people and businesses.

Incident 71 · 28 Reports
Google admits its self-driving car got it wrong: Bus crash was caused by software

2016-09-26

On February 14, 2016, a Google autonomous test vehicle was partially responsible for a low-speed collision with a bus on El Camino Real in Google's hometown of Mountain View, CA.

Incident 19 · 27 Reports
Sexist and Racist Google Adsense Advertisements

2013-01-23

Advertisements chosen by Google AdSense were reported to produce sexist and racist results.

Incident 16 · 24 Reports
Images of Black People Labeled as Gorillas

2015-06-03

Google Photos' image-processing software mistakenly labeled a Black couple as "gorillas."

Incidents Harmed By

Incident 467 · 14 Reports
Google's Bard Shared Factually Inaccurate Info in Promo Video

2023-02-07

Google's conversational AI "Bard" was shown in the company's promotional video providing false information about which telescope first took pictures of a planet outside Earth's solar system, reportedly causing Alphabet's shares to temporarily plummet.

Incidents involved as Developer

Incident 469 · 3 Reports
Automated Adult Content Detection Tools Showed Bias against Women's Bodies

2006-02-25

Automated content moderation tools used to detect sexual explicitness or "raciness" reportedly exhibited bias against images of women's bodies, suppressing their reach even though the content did not violate platform policies.

Incident 12 · 1 Report
Common Biases of Vector Embeddings

2016-07-21

Researchers from Boston University and Microsoft Research New England demonstrated gender bias in the most common techniques used to embed words for natural language processing (NLP).
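
This is the study often summarized as "man is to computer programmer as woman is to homemaker": analogy arithmetic over pretrained word vectors returns stereotyped completions. Below is a minimal sketch of that kind of analogy probe, assuming the gensim library and its hosted "word2vec-google-news-300" vectors; it is an illustration, not the authors' own code.

```python
# A minimal sketch of an analogy probe for gender bias in word embeddings.
# Assumes gensim and its downloadable "word2vec-google-news-300" vectors;
# this is an illustration, not the original study's code.
import gensim.downloader as api

vectors = api.load("word2vec-google-news-300")  # ~1.6 GB on first download

# Analogy arithmetic: "man" is to "programmer" as "woman" is to ... ?
# Biased embeddings tend to complete this with stereotyped terms.
for word, score in vectors.most_similar(
    positive=["woman", "programmer"], negative=["man"], topn=5
):
    print(f"{word}\t{score:.3f}")
```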

Incident 81 · 1 Report
Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers

2020-10-21

A study by the University of Toronto, the Vector Institute, and MIT showed that the datasets used to train AI systems to classify chest X-rays led those systems to exhibit gender, socioeconomic, and racial biases.
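
Bias of this kind surfaces as unequal error rates across patient subgroups, for example higher underdiagnosis (false-negative) rates for some groups than others. The following is a minimal sketch of such a subgroup audit, using hypothetical column names and toy data rather than the study's actual pipeline.

```python
# A minimal sketch of a subgroup fairness audit for a binary chest X-ray
# classifier: compare false-negative ("underdiagnosis") rates across groups.
# Column names and values are hypothetical, not from the study.
import pandas as pd

df = pd.DataFrame({
    "sex":    ["F", "F", "F", "F", "M", "M", "M", "M"],
    "y_true": [1,   1,   1,   0,   1,   1,   1,   0],   # radiologist label
    "y_pred": [0,   0,   1,   0,   1,   1,   0,   0],   # model prediction
})

# False-negative rate per group: P(y_pred = 0 | y_true = 1, group).
positives = df[df["y_true"] == 1]
fnr = positives.groupby("sex")["y_pred"].apply(lambda s: (s == 0).mean())
print(fnr)  # A large gap between groups signals disparate underdiagnosis.
```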

Related Entities