In response to news that the ACLU of Michigan is lodging a complaint against the Detroit police on behalf of a black man falsely arrested because the facial recognition technology the department uses cannot consistently and accurately tell black people apart, a security expert offers perspective on facial recognition and the tech sector’s diversity gap.
When the product development team behind such a sensitive technology – one with the potential for intrusiveness at an individual level – is not diverse, this problem, and it is a catastrophic one, is all but inevitable. It’s quite likely that the majority of technology creators are not from underrepresented groups, and they create products with algorithmic bias. ALL people are biased; that’s human nature, and it shows in products. We assume that if a technology works for us, it will work for others who aren’t like us – a wrong and dangerous assumption.
This is incredibly problematic, since statistics show that minorities are far more likely to be falsely convicted of crimes. A 2017 study for the National Registry of Exonerations examined years of exoneration data to better understand how race influences whether someone is wrongfully convicted and ultimately exonerated (Vox covered the study). It found that, compared with innocent white people, innocent black people are seven times more likely to be falsely convicted of murder, 3.5 times more likely to be wrongly convicted of sexual assault, and 12 times more likely to be unjustly convicted of drug crimes.
When you factor in, on top of misidentification by facial recognition technology, the long history of black people being convicted on flimsy hearsay evidence, the use of facial recognition needs serious examination before being put into practice.
There are other factors to consider before adopting facial recognition technologies as well. The technology automates surveillance without a legal or regulatory framework. It may violate individuals’ right to privacy because it denies citizens the opportunity to give or withhold consent, and in doing so it infringes on the rights to freedom of association, assembly, and expression. It is already being used on protesters, and it is often inaccurate. Its inherent algorithmic bias can be used to discriminate.
These are some of the reasons why – at a minimum – notification that facial recognition is in use is essential, and why, preferably, consent should be requested and secured. It’s also why groups such as The Innocence Project deserve support.