That’s “good morning” in Shona
Twitter was originally a text-based social media network.
In its early days, users had to rely on third-party apps like Tweetphoto to upload and share their images. Tweetphoto later became Plixi and was eventually acquired by Lockerz.
Over time, Twitter has been re-engineered to handle images natively. But every once in a while, its users experience the kind of glitches that remind everyone of Twitter’s origins as a text-based social media network.
Doesn’t see colour
Twitter timelines are constrained spaces, hence the 140-character and, eventually, 280-character limit. To fit images of different dimensions into user timelines, Twitter uses previews.
And to determine which part of an image the preview should contain, Twitter used, until recently, a cropping tool driven by an algorithm called a saliency model.
Saliency models are trained on data about where the human eye looks in a picture, as a way of predicting which regions are likely to matter most to most viewers. The cropping tool then discards the rest of the image.
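To make the idea concrete, here is a minimal sketch of saliency-driven cropping. It assumes we already have a saliency map (one importance score per pixel) and simply searches for the fixed-size window with the highest total saliency; Twitter's actual model was a trained neural network, so this is an illustration of the cropping step, not their implementation.

```python
import numpy as np

def crop_by_saliency(saliency: np.ndarray, crop_h: int, crop_w: int):
    """Return the (top, left) corner of the crop window whose total
    saliency is highest, found by exhaustive search."""
    h, w = saliency.shape
    best, best_pos = -1.0, (0, 0)
    for top in range(h - crop_h + 1):
        for left in range(w - crop_w + 1):
            score = saliency[top:top + crop_h, left:left + crop_w].sum()
            if score > best:
                best, best_pos = score, (top, left)
    return best_pos

# A toy 4x4 saliency map with one "important" region in the bottom-right:
toy = np.zeros((4, 4))
toy[2:4, 2:4] = 1.0
print(crop_by_saliency(toy, 2, 2))  # -> (2, 2)
```

Whatever falls outside the chosen window is what the user never sees in the preview, which is exactly where the bias described below shows up.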
Over time, Twitter users started noticing that the model, while highly effective as a cropping mechanism, had some significant biases.
One user tweeted two photos of then-US Senate Majority Leader Mitch McConnell and former President Barack Obama. In one image, Obama was at the top; in the other, he was at the bottom. Yet Twitter’s algorithm cropped both previews to show only McConnell, indicating that it found him more “salient” than Obama.
When experiments like this came to Twitter’s attention, the company ran experiments of its own and reported several findings; two stood out to me:
Demographic differences: In a lineup with both black and white people, white people were often cropped ahead of black people. And in a lineup of people of both genders, men were often picked ahead of women.
Male gaze: The cropping tool picked a woman’s chest or legs as a salient feature.
Following these findings, Twitter has now suspended the use of its image-cropping tool.
Much has been written about why, because of the bias in today’s most widely used computer vision systems, it’s too early to deploy the technology for sensitive use cases such as law enforcement and mass surveillance.
However, this is different.
Twitter’s saliency models aren’t being used to make decisions that can impact an individual’s civil liberties. What this episode does, however, is underline the issue of bias.
But there are no easy solutions.
At first glance, it may seem that the image-cropping tool’s errors can be articulated purely as a fairness concern.
And researchers have developed technical ways of defining algorithmic fairness, such as requiring that models have equal predictive values across groups or requiring that models have equal false positive and false negative rates across groups.
In the case of the Twitter saliency model, there was a roughly 4% difference in favour of white individuals; fairness, in this sense, implies that the disparity should be approximately 0%.
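The demographic-parity framing above can be sketched in a few lines: compare, per group, how often a group member was kept in the crop, and report the gap. The data here is invented for illustration; Twitter's actual evaluation used paired images and reported the roughly 4% gap mentioned above.

```python
def favor_rate(outcomes):
    """Fraction of trials in which the group's member was kept in the crop."""
    return sum(outcomes) / len(outcomes)

# Toy outcomes: 1 = kept in the preview, 0 = cropped out.
group_a = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]   # e.g. white individuals
group_b = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]   # e.g. Black individuals

# Demographic parity asks that this gap be approximately 0%.
disparity = favor_rate(group_a) - favor_rate(group_b)
print(f"disparity: {disparity:+.0%}")
```

The stricter definitions mentioned earlier (equal predictive values, or equal false positive and false negative rates across groups) would replace the simple favor rate with error rates conditioned on ground truth, but the comparison-across-groups shape stays the same.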
However, formalized fairness metrics have limitations as an assessment tool. Regardless of the statistical results of the analysis, the risk of representational harm remains. Representational harm occurs when people underrepresented in a data set do not have the choice to represent themselves as they intend.
For example, default camera settings are often not optimized to capture darker skin tones, resulting in lower-quality database images of Black people. This means that when a model chooses the salient parts of a picture containing both lighter- and darker-skinned individuals, some form of bias is likely to persist.
The challenge for the entire ML community, therefore, is to find a holistic, long-term approach to solving this bias problem. But the challenge for the African ML community, in particular, is to lead this effort, or at least play an active role in it.
Working in the African AI/ML space, I often question why the ML equivalent of Lockerz can’t be developed here. It’s a big ask, I know, especially considering that the community is still in its infancy. But as the continent with the world’s largest population of Black people, Africa is affected by this bias more than anywhere else.