Entities

a software engineer named Mark

Incidents Harmed By

Incident 303 · 2 Reports
Google’s Automated Child Abuse Detection Wrongfully Flagged a Parent’s Naked Photo of His Child

2022-08-21

Google’s automated detection of abusive images of children incorrectly flagged a parent’s photo intended for a healthcare provider, resulting in a false police report of child abuse and the loss of access to his online accounts and information.


Related Entities

Google

Incidents involved as both Developer and Deployer
  • Incident 303 · 2 Reports: Google’s Automated Child Abuse Detection Wrongfully Flagged a Parent’s Naked Photo of His Child

parents using telemedicine services

Incidents Harmed By
  • Incident 303 · 2 Reports: Google’s Automated Child Abuse Detection Wrongfully Flagged a Parent’s Naked Photo of His Child
