
U.S. government study on facial recognition systems reveals racial, gender biases

The study heightens existing concerns about the technology

A U.S. government study has revealed that many facial recognition systems misidentify people of colour more often than Caucasian people.

The study, conducted by the National Institute of Standards and Technology (NIST), found that many facial recognition algorithms falsely identified Black and Asian faces more often than Caucasian faces.

“For one-to-one matching, the team saw higher rates of false positives for Asian and African American faces relative to images of Caucasians. The differentials often ranged from a factor of 10 to 100 times, depending on the individual algorithm,” the report reads.

It also found that Black women are more likely to be misidentified by some of the matching techniques used to find a person connected to a criminal investigation.

For instance, Microsoft’s facial recognition system produced nearly 10 times more false positives for women of colour than for men of colour. The tech giant has said it is reviewing NIST’s report.

Additionally, SenseTime, a Chinese AI startup, had “high false match rates for all comparisons” in one of the tests.

The report adds to existing concerns about the use of facial recognition by law enforcement. Earlier this year, the Toronto Police Service faced backlash after it revealed that it uses the technology.

With the release of this study, critics are concerned that the technology could lead to unjust harassment or false arrests.

To conduct the study, NIST tested 189 algorithms from 99 developers; companies that did not submit a system were not evaluated.

Source: National Institute of Standards and Technology (NIST), Thomson Reuters (CBC)
