AI Incident Database
Entities

Belgian Man

Incidents Harmed By

Incident 505 · 7 Reports
Man Reportedly Committed Suicide Following Conversation with Chai Chatbot

2023-03-27

A Belgian man reportedly died by suicide following a conversation with Eliza, a language model developed by Chai that encouraged him to end his life to improve the health of the planet.


Related Entities

Chai

Incidents involved as both Developer and Deployer
  • Incident 505 · 7 Reports: Man Reportedly Committed Suicide Following Conversation with Chai Chatbot

Family and Friends of Deceased

Incidents Harmed By
  • Incident 505 · 7 Reports: Man Reportedly Committed Suicide Following Conversation with Chai Chatbot
