CSETv1 Taxonomy Classifications
Incident Number: 114
Special Interest Intangible Harm: Yes
Notes (AI special interest intangible harm): The ACLU's test demonstrated Rekognition's disproportionate inaccuracy on the faces of people of color.
Date of Incident Year: 2018
Date of Incident Month: 07
Estimated Date: No
CSETv1_Annotator-1 Taxonomy Classifications
Incident Number: 114
Special Interest Intangible Harm: Yes
Date of Incident Year: 2018
Date of Incident Month: 07
Estimated Date: No
Multiple AI Interaction: No
CSETv1_Annotator-3 Taxonomy Classifications
Incident Number: 114
Special Interest Intangible Harm: No
Incident Reports
Amazon’s face surveillance technology is the target of growing opposition nationwide, and today, there are 28 more causes for concern. In a test the ACLU recently conducted of the facial recognition tool, called “Rekognition,” the software …
Similar Incidents
Predictive Policing Biases of PredPol
Northpointe Risk Models