Incident 43: Racist AI behaviour is not a new problem
CSETv1 Taxonomy Classifications
Harm Distribution Basis
nation of origin, citizenship, immigrant status, sex, race
Sector of Deployment
Education, human health and social work activities
CSETv0 Taxonomy Classifications
Full Description
From 1982 to 1986, St George's Hospital Medical School used a program to autonomously select candidates for admissions interviews. The system, designed by staff member Dr. Geoffrey Franglen, drew on past admission data to select potential students based on their standardized university applications. After the program achieved a 90-95% match with the admission panel's selection of interview candidates, it was adopted as the primary method of initial applicant screening. In 1986, lecturers at the school recognized that the system was biased against women and members of ethnic minorities and reported the issue to Britain's Commission for Racial Equality.
Short Description
From 1982 to 1986, St George's Hospital Medical School used a program to automate a portion of its admissions process; the program discriminated against women and members of ethnic minorities.
Severity
Moderate
Harm Distribution Basis
Race, Sex
Harm Type
Harm to civil liberties
AI System Description
A custom-designed statistical analysis program that used data from past admissions decisions to select which university applicants would be invited to admissions interviews (an illustrative sketch of this failure mode follows the Data Inputs field below).
System Developer
Dr. Geoffrey Franglen
Sector of Deployment
Human health and social work activities
Relevant AI functions
Cognition
AI Techniques
Machine learning
AI Applications
decision support
Location
London, England
Named Entities
St George’s Hospital Medical School, Dr. Geoffrey Franglen, Universities Central Council on Admissions, Commission for Racial Equality
Technology Purveyor
St George’s Hospital Medical School, Dr. Geoffrey Franglen
Beginning Date
1982-01-01
Ending Date
1986-01-01
Near Miss
Harm caused
Intent
Accident
Lives Lost
No
Laws Implicated
United Kingdom's Race Relations Act
Data Inputs
Standardized university admission form, Previous admission and rejection decisions
Incident Reports
Reports Timeline
- A Blot on the Profession: Discrimination in medicine against women and members of ethnic minorities has long been suspected, but it has now been proved. St George's Hospital Medical School has been found guilty by the Commission for Racial E…
- As AI spreads, this will become an increasingly important and controversial issue: For one British university, what began as a time-saving exercise ended in disgrace when a computer model set up to streamline its admissions process exposed …
- Professor Margaret Boden, an AI and cognitive science researcher, took the time to speak to me in 2010 about computers, AI, morality and the future. One of the stories she told me comes back to me every now and then, most recently by Micros…
- Companies and governments need to pay attention to the unconscious and institutional biases that seep into their algorithms, argues cybersecurity expert Megan Garcia. Distorted data can skew results in web searches, home loan decisions, or …
Variants
Similar Incidents
AI Beauty Judge Did Not Like Dark Skin
Sexist and Racist Google Adsense Advertisements