Twitter is learning firsthand how difficult it is to eliminate racial bias from algorithms.
The social network’s Liz Kelley explained that the company has more analysis to do after cryptographic engineer Tony Arcieri ran an experiment suggesting Twitter’s algorithm is biased in how it prioritizes photos. When photos of Barack Obama and Mitch McConnell were attached to tweets, Twitter consistently highlighted McConnell’s face; Obama’s appeared only when Arcieri inverted the colors, taking skin color out of the equation.
— Tony “Abolish (Pol)ICE” Arcieri 🦀 (@bascule) September 19, 2020
Others tried reversing the photo order and swapping the names, to no avail. Intertheory’s Kim Sherrell discovered that a higher-contrast smile could change the outcome. Meanwhile, scientist Matt Blaze found that the prioritization varied depending on which official Twitter app was used, with TweetDeck being more neutral.
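Twitter has not published its cropping model, but saliency-based croppers generally score candidate regions and keep the highest-scoring one. As a purely hypothetical sketch of why Sherrell’s higher-contrast smile could flip the result, here is a toy scorer that uses RMS contrast as a stand-in saliency measure (the function names and the contrast heuristic are illustrative assumptions, not Twitter’s actual method):

```python
# Hypothetical sketch: a contrast-driven region scorer, standing in for a
# saliency model. Twitter's real algorithm is unpublished; RMS contrast is
# only an illustrative proxy.

def rms_contrast(pixels):
    """Root-mean-square contrast of a flat list of grayscale values (0-255)."""
    mean = sum(pixels) / len(pixels)
    return (sum((p - mean) ** 2 for p in pixels) / len(pixels)) ** 0.5

def pick_crop(region_a, region_b):
    """Return the label of the region a contrast-based scorer would keep."""
    return "A" if rms_contrast(region_a) >= rms_contrast(region_b) else "B"

# Two synthetic 4-pixel "regions": one high-contrast, one nearly flat.
high_contrast = [20, 230, 25, 235]
low_contrast = [120, 130, 125, 135]
print(pick_crop(high_contrast, low_contrast))  # prints "A"
```

Under a scorer like this, whichever face photographs with stronger local contrast wins the crop, regardless of who is more newsworthy, which is the kind of unintended proxy that bias testing is supposed to catch.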
Kelley said Twitter tested the model for bias before shipping it, but “didn’t find evidence” of any at the time. She added that Twitter will open source its work so that others can “review and replicate” it.
thanks to everyone who raised this. we tested for bias before shipping the model and didn’t find evidence of racial or gender bias in our testing, but it’s clear that we’ve got more analysis to do. we’ll open source our work so others can review and replicate. https://t.co/E6sZV3xboH
— liz kelley (@lizkelley) September 20, 2020
There’s no guarantee Twitter will be able to correct this, but the experiment illustrates the real dangers of algorithmic bias, regardless of intent. It could push someone out of the spotlight even when they’re central to a linked news article or social media post. And it suggests problems like this won’t become rare anytime soon.