Over the weekend, the results of a social experiment carried out by some Twitter users set the internet abuzz.
In the experiment, a Twitter user @bascule tweeted:
“Trying a horrible experiment… Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama?“
Attached to the tweet were two tall, rectangular images.
The first showed a picture of US Senate majority leader Mitch McConnell (a white man) at the top, a long stretch of white space in the middle, and a picture of former US president Barack Obama (a black man) at the bottom. The second image flipped the positions of McConnell and Obama.
In the tweet's preview, Twitter's algorithm showed the picture of Mitch McConnell for both images. The experiment was then repeated with different images and facial expressions, and most of the time, Twitter's algorithm previewed the face of the white man over that of the black man.
This sparked controversy over the weekend, as many users accused Twitter of racial bias.
According to a report by The Verge, the informal testing began after a Twitter user tweeted about a problem he noticed in Zoom's facial recognition, which was not showing the face of a black colleague on calls. When he posted screenshots to Twitter, he noticed Twitter's previews, too, were favouring his white face over his black colleague's face.
According to a blog post by Twitter's machine-learning researchers, when Twitter first introduced automatic image-crop previews, it relied on face detection to choose the crop:
“Previously, we used face detection to focus the view on the most prominent face we could find. While this is not an unreasonable heuristic, the approach has obvious limitations since not all images contain faces. Additionally, our face detector often missed faces and sometimes mistakenly detected faces when there were none. If no faces were found, we would focus the view on the center of the image. This could lead to awkwardly cropped preview images.”
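The heuristic the researchers describe — centre the crop on the most prominent detected face, and fall back to the centre of the image when no face is found — can be sketched roughly as follows. This is an illustrative reconstruction, not Twitter's actual code: the face boxes are assumed to come from some external detector, and "most prominent" is approximated here as the largest face by area.

```python
from typing import List, Optional, Tuple

# A bounding box: (x, y, width, height), all in pixels.
Box = Tuple[int, int, int, int]

def crop_window(img_w: int, img_h: int,
                crop_w: int, crop_h: int,
                faces: Optional[List[Box]] = None) -> Box:
    """Return the (left, top, width, height) of the preview crop.

    Illustrative sketch of the heuristic quoted above, not Twitter's
    implementation: centre on the most prominent face if any were
    detected, otherwise fall back to the centre of the image.
    """
    if faces:
        # "Most prominent" face: here, simply the largest by area.
        x, y, w, h = max(faces, key=lambda b: b[2] * b[3])
        cx, cy = x + w // 2, y + h // 2   # centre of that face
    else:
        # No faces found: fall back to the centre of the image.
        cx, cy = img_w // 2, img_h // 2
    # Centre the crop on (cx, cy), clamped so it stays inside the image.
    left = min(max(cx - crop_w // 2, 0), img_w - crop_w)
    top = min(max(cy - crop_h // 2, 0), img_h - crop_h)
    return (left, top, crop_w, crop_h)
```

On a tall two-portrait image like the ones in the experiment, whichever face the detector reports as most prominent wins the crop — which is exactly why a biased or unreliable detector produces the skewed previews users observed.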
Liz Kelly, a spokesperson at Twitter, said that Twitter had tested its system but could not find any evidence of racial or gender bias. She added, however, that the experiment shows there is a need for more testing.
Kelly also said that Twitter will open-source its work on the algorithm so that other developers can learn from its findings.