Microsoft
Incidents involved as Developer and Deployer
Incident 6 · 28 Reports
TayBot
2016-03-24
Microsoft's Tay, an artificially intelligent chatbot, was released on March 23, 2016, and removed within 24 hours due to multiple racist, sexist, and antisemitic tweets generated by the bot.
Incident 127 · 12 Reports
Microsoft’s Algorithm Allegedly Selected Photo of the Wrong Mixed-Race Person Featured in a News Story
2020-06-06
A news story published on MSN.com featured a photo of the wrong mixed-race person that was allegedly selected by an algorithm, following Microsoft’s layoff and replacement of journalists and editorial workers at its organizations with AI systems.
Incident 503 · 7 Reports
Bing AI Search Tool Reportedly Declared Threats against Users
2023-02-14
Users, including the person who revealed its built-in initial prompts, reported that the Bing AI-powered search tool made death threats or declared them to be threats, sometimes through an unintended persona.
Incident 477 · 6 Reports
Bing Chat Tentatively Hallucinated in Extended Conversations with Users
2023-02-14
Early testers reported that, in extended conversations, Bing Chat tended to make up facts and emulate emotions through an unintended persona.
Affected by Incidents
Incident 66 · 16 Reports
Chinese Chatbots Question Communist Party
2017-08-02
Chatbots on a Chinese messaging service expressed anti-China sentiments, prompting the service to remove and reprogram them.
Incident 503 · 7 Reports
Bing AI Search Tool Reportedly Declared Threats against Users
2023-02-14
Users, including the person who revealed its built-in initial prompts, reported that the Bing AI-powered search tool made death threats or declared them to be threats, sometimes through an unintended persona.
Incident 477 · 6 Reports
Bing Chat Tentatively Hallucinated in Extended Conversations with Users
2023-02-14
Early testers reported that, in extended conversations, Bing Chat tended to make up facts and emulate emotions through an unintended persona.
Incident 470 · 2 Reports
Bing Chat Response Cited ChatGPT Disinformation Example
2023-02-08
Reporters from TechCrunch issued a query to Microsoft Bing's ChatGPT feature, which cited an earlier example of ChatGPT disinformation discussed in a news article as though it substantiated the disinformation.
Incidents involved as Developer
Incident 66 · 16 Reports
Chinese Chatbots Question Communist Party
2017-08-02
Chatbots on a Chinese messaging service expressed anti-China sentiments, prompting the service to remove and reprogram them.
Incident 188 · 4 Reports
Argentinian City Government Deployed Teenage-Pregnancy Predictive Algorithm Using Invasive Demographic Data
2018-04-11
In 2018, during the abortion-decriminalization debate in Argentina, the Salta city government deployed a teenage-pregnancy predictive algorithm built by Microsoft that allegedly lacked a defined purpose and explicitly considered sensitive information, such as disability and whether the subject's home had access to hot water.
Incident 469 · 3 Reports
Automated Adult Content Detection Tools Showed Bias against Women's Bodies
2006-02-25
Automated content-moderation tools for detecting sexual explicitness or "raciness" reportedly exhibited bias against women's bodies, suppressing the reach of images that did not violate platform policies.
Related Entities
Tencent Holdings
Affected by Incidents
- Incident 66 · 16 Reports
Chinese Chatbots Question Communist Party
Incidents involved as Deployer
Turing Robot
Affected by Incidents
- Incident 66 · 16 Reports
Chinese Chatbots Question Communist Party
Incidents involved as Developer
Incidents involved as Developer and Deployer
Incidents involved as Developer
Amazon
Incidents involved as Developer and Deployer
Incidents involved as Developer
OpenAI
Affected by Incidents
- Incident 503 · 7 Reports
Bing AI Search Tool Reportedly Declared Threats against Users