Incidents involved as Developer and Deployer
Incident 45 · 29 Reports
Defamation via AutoComplete
2011-04-05
Google's autocomplete feature, alongside its image search results, led to the defamation of people and businesses.
Incident 71 · 28 Reports
Google admits its self driving car got it wrong: Bus crash was caused by software
2016-09-26
On February 14, 2016, a Google autonomous test vehicle was partially responsible for a low-speed collision with a bus on El Camino Real in Google's hometown of Mountain View, CA.
Incident 19 · 27 Reports
Sexist and Racist Google Adsense Advertisements
2013-01-23
Advertisements chosen by Google AdSense were reported to produce sexist and racist results.
Incident 16 · 24 Reports
Images of Black People Labeled as Gorillas
2015-06-03
Google Photos image processing software mistakenly labelled a black couple as "gorillas."
Affected by Incidents
Incident 467 · 14 Reports
Google's Bard Shared Factually Inaccurate Info in Promo Video
2023-02-07
Google's conversational AI "Bard" was shown in the company's promotional video providing false information about which satellite first took pictures of a planet outside the Earth's solar system, reportedly causing shares to temporarily plummet.
Incidents involved as Developer
Incident 469 · 3 Reports
Automated Adult Content Detection Tools Showed Bias against Women Bodies
2006-02-25
Automated content moderation tools for detecting sexual explicitness or "raciness" reportedly exhibited bias against women's bodies, resulting in suppressed reach despite the content not violating platform policies.
Incident 12 · 1 Report
Common Biases of Vector Embeddings
2016-07-21
Researchers from Boston University and Microsoft Research New England demonstrated gender bias in the most common techniques used to embed words for natural language processing (NLP).
Incident 81 · 1 Report
Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers
2020-10-21
A study by the University of Toronto, the Vector Institute, and MIT showed that the datasets used to train AI systems for classifying chest X-rays led those systems to exhibit gender, socioeconomic, and racial biases.