Analyst Calls Google Software “Racist”

Writing for Fusion, Charles Pulliam-Moore argues that Google Photos's software is "racist" because its algorithm mistakenly labeled some black users as "gorillas." Where most see a glitch, Pulliam-Moore sees racism.

His article, "Google Photos identified black people as 'gorillas,' but racist software isn't new," takes Google to task over what he sees as yet another instance of software companies ignoring diversity.

Pulliam-Moore took particular offense at a tweet showing black users labeled as "gorillas."

Google issued an official statement: "We're appalled and genuinely sorry that this happened. There is still clearly a lot of work to do with automatic image labeling, and we're looking at how we can prevent these types of mistakes from happening in the future."

Not good enough, says Pulliam-Moore: "As nice as it is of Google to assure us that something like this is a freak instance of coding-gone-wrong, it's hardly the first time that we've seen software show an implicit bias against people of color."

Then, oddly, Pulliam-Moore brings up Flickr, which unveiled a similar auto-tagging program last month that misidentified both a white woman and a black man. He glosses over the fact that the software erred on a white person as well, and sticks to his claim that the algorithms are racist.

He concludes: "Perhaps if the titans of Silicon Valley hired more engineers of color, things like this wouldn't happen so often. Or, you know, ever."


Founder and editor of the Social Memo