Twitter has been drawing a lot of criticism over the algorithm that decides how photos are cropped and displayed on users’ timelines. The algorithm has been accused of bias, preferring the faces of white people over people with darker skin in preview crops. Twitter has acknowledged the issue and promised to rework the algorithm.
People discovered the bias over the weekend when a number of Twitter users posted photos featuring a Black person and a white person. Twitter’s preview of such photos on users’ timelines consistently favored the white person.
Twitter user Kim Sherrell experimented with different permutations and combinations of images and concluded that the algorithm was biased. In one of the tweets, he compared photos featuring Barack Obama and Mitch McConnell; during this test, Twitter’s preview favored McConnell over Obama. The algorithm also behaved differently when inverted colours were used in the image.
Researcher Matt Blaze noted that the bias depends on the client being used: it appeared in the official Twitter app, while Tweetdeck produced more neutral results.
Twitter’s Liz Kelly was quick to respond to the reports of bias. She said the company will open source its re-evaluated work so that it can be reviewed and replicated by others.
“Thanks to everyone who raised this. we tested for bias before shipping the model and didn’t find evidence of racial or gender bias in our testing, but it’s clear that we’ve got more analysis to do. we’ll open source our work so others can review and replicate,” she said in a tweet.
It is worth mentioning that Twitter engineer Zehan Wang said the company had conducted bias studies before the algorithm’s release in 2017.