Incidents involved in as Developer and Deployer
Incident 45 · 29 Reports
Defamation via AutoComplete
2011-04-05
Google's autocomplete feature, shown alongside its image search results, led to the defamation of people and businesses.
Incident 71 · 28 Reports
Google admits its self-driving car got it wrong: Bus crash was caused by software
2016-09-26
On February 14, 2016, a Google autonomous test vehicle was partially responsible for a low-speed collision with a bus on El Camino Real in Google’s hometown of Mountain View, CA.
Incident 19 · 27 Reports
Sexist and Racist Google Adsense Advertisements
2013-01-23
Advertisements selected by Google AdSense were reported to produce sexist and racist results.
Incident 16 · 24 Reports
Images of Black People Labeled as Gorillas
2015-06-03
Google Photos image processing software mistakenly labeled a black couple as "gorillas."
Affected by Incidents
Incident 467 · 14 Reports
Google's Bard Shared Factually Inaccurate Info in Promo Video
2023-02-07
Google's conversational AI "Bard" was shown in the company's promotional video providing false information about which telescope first took pictures of a planet outside the solar system, reportedly causing shares to plummet temporarily.
Incidents involved in as Developer
Incident 469 · 3 Reports
Automated Adult Content Detection Tools Showed Bias against Women's Bodies
2006-02-25
Automated content moderation tools used to detect sexual explicitness or "raciness" reportedly exhibited bias against women's bodies, suppressing the reach of images that did not break platform policies.
Incident 12 · 1 Report
Common Biases of Vector Embeddings
2016-07-21
Researchers from Boston University and Microsoft Research New England demonstrated gender bias in the most common techniques used to embed words for natural language processing (NLP).
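The effect the researchers reported (Bolukbasi et al., 2016) can be probed directly: define a gender direction from the "she" and "he" vectors and project occupation words onto it. Below is a minimal sketch, not the paper's own code, assuming the gensim library and its downloadable word2vec-google-news-300 model:

```python
# Minimal probe of gender bias in word embeddings, in the spirit of
# Bolukbasi et al. (2016). Assumes gensim is installed; the model
# download is roughly 1.6 GB on first use.
import numpy as np
import gensim.downloader as api

model = api.load("word2vec-google-news-300")

def unit(v):
    """Normalize a vector to unit length."""
    return v / np.linalg.norm(v)

# Gender direction: difference of the normalized "she" and "he" vectors.
gender_dir = unit(unit(model["she"]) - unit(model["he"]))

# Projection onto the gender direction: positive scores lean toward
# "she", negative toward "he" in the embedding geometry.
for word in ["nurse", "receptionist", "engineer", "programmer"]:
    score = float(np.dot(unit(model[word]), gender_dir))
    print(f"{word:>14s}: {score:+.3f}")
```

On the pretrained Google News vectors, occupation words stereotypically associated with women tend to score positive and those associated with men negative, which is the pattern the paper documents.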
Incident 81 · 1 Report
Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers
2020-10-21
A study by the University of Toronto, the Vector Institute, and MIT showed that the datasets used to train AI systems to classify chest X-rays led those systems to exhibit gender, socioeconomic, and racial biases.
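Biases of this kind are typically quantified by comparing an error metric per demographic group, for example the true-positive rate (how often genuinely sick patients are flagged). Below is a minimal, self-contained sketch of such a subgroup audit using synthetic data with hypothetical numbers, not the study's data or code:

```python
# Sketch of a per-group fairness audit: compare a classifier's
# true-positive rate (TPR) across patient groups. All data below is
# synthetic; the injected miss rates are illustrative only.
import numpy as np

def tpr(y_true, y_pred):
    """True-positive rate: fraction of actual positives predicted positive."""
    positives = y_true == 1
    return (y_pred[positives] == 1).mean() if positives.any() else float("nan")

rng = np.random.default_rng(0)
n = 10_000
y_true = rng.integers(0, 2, n)        # ground-truth disease label
group = rng.choice(["F", "M"], n)     # demographic attribute, e.g. patient sex

# Simulate a classifier that misses more true positives in group "F".
miss_rate = np.where(group == "F", 0.30, 0.10)
y_pred = np.where((y_true == 1) & (rng.random(n) < miss_rate), 0, y_true)

for g in ["F", "M"]:
    mask = group == g
    print(f"group {g}: TPR = {tpr(y_true[mask], y_pred[mask]):.3f}")
```

A large TPR gap between groups signals systematic underdiagnosis of one group, which is the kind of disparity the study reported in chest X-ray classifiers.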