Twitter plans hackers’ contest to identify biases in image algorithm

CUPERTINO
Twitter Inc has said it will launch a competition for computer researchers and hackers to identify biases in its image-cropping algorithm, after a group of its own researchers previously found the algorithm tended to exclude black people and men.
The competition is part of a wider effort across the tech industry to ensure artificial intelligence technologies act ethically, the social networking company said in a blog post, adding that the bounty competition is aimed at identifying “potential harms of this algorithm beyond what we identified ourselves.”
Following criticism in 2020 about image previews in posts excluding black people’s faces, the company said in May that a study by three of its machine learning researchers had found the algorithm did in fact display “unequal treatment based on demographic differences.”
The algorithm favoured women over men by 8 percent, and photos of white people over those of black people by 4 percent, Twitter said. Within those demographics, it favoured white women over black women by 7 percent, and white men over black men by 2 percent.
The analysis also delved into objectification biases – the so-called “male gaze” – after people on Twitter pointed to instances where the cropping chose a woman’s chest or legs as a salient feature. However, the researchers “did not find any evidence that the algorithm cropped images of men or women in areas other than their face at a significant rate”.
To test this, the researchers ran the algorithm on 100 randomly chosen images and found that only three crops centred on bodies rather than faces. The study stated that “when images weren’t cropped at the head, they were cropped to non-physical aspects of the image, such as a number on a sports jersey”.
On Friday, Twitter publicly released the computer code that decides how images are cropped in the Twitter feed, and said contest participants will be asked to show how the algorithm could cause harm, such as stereotyping or denigrating any group of people.