
Twitter investigating possible racial bias in photo preview algorithm

It's looking into why its photo preview appears to favour white faces over Black faces


Twitter is investigating whether its photo preview algorithm has a racial bias, after user experiments showed that it favours the faces of white people over those of Black people.

Users discovered the problem over the weekend: when they posted images containing both a Black person’s face and a white person’s face, the photo previews displayed the white faces more often.

Twitter users then ran more informal tests and found that the preview algorithm also chose to display non-Black cartoon characters.

The social media giant is now looking into why its neural network chooses to display white people’s faces more frequently.

Liz Kelley, a member of Twitter’s communications team, tweeted that the company “tested for bias before shipping the model and didn’t find evidence of racial or gender bias in our testing, but it’s clear that we’ve got more analysis to do. We’ll open source our work so others can review and replicate.”

Twitter’s chief technology officer Parag Agrawal said that the algorithm needs “continuous improvement” and that he’s eager to learn from the experiments.

Interestingly, the informal testing began on Twitter after a user outlined a problem he had found with Zoom’s facial recognition technology: he posted screenshots of his Zoom meetings to show that Zoom was not displaying his Black colleague’s face on calls.

However, once he posted those screenshots on Twitter, he noticed that Twitter’s preview algorithm, too, favoured his white face over his Black colleague’s.

These recent developments are more disappointing than surprising, since several facial recognition technologies and algorithms have been found to be racially biased in certain situations.

Via: The Verge

