YouTube young users

Incidents Harmed By

Incident 281 · 3 Reports
YouTube's Algorithms Failed to Remove Violating Content Related to Suicide and Self-Harm

2019-02-04

Terms-of-service-violating videos related to suicide and self-harm reportedly bypassed YouTube’s content moderation algorithms, allegedly exposing young users to graphic content through recommended videos.


Related Entities

YouTube

Incidents involved as both Developer and Deployer
  • Incident 281 · 3 Reports
    YouTube's Algorithms Failed to Remove Violating Content Related to Suicide and Self-Harm

YouTube users

Incidents Harmed By
  • Incident 281 · 3 Reports
    YouTube's Algorithms Failed to Remove Violating Content Related to Suicide and Self-Harm
