This article was published on September 3, 2020

Facebook doesn’t fix its shit until there’s a controversy — that needs to change

Earlier today, Facebook removed the account of T. Raja Singh, a leader of India’s ruling Bharatiya Janata Party (BJP), for hateful speech against Muslim minorities.

This step came after the Wall Street Journal (WSJ) reported that the company’s policy executive, Ankhi Das, stopped moderators from removing hateful posts and accounts of BJP politicians. Further reporting outlined how Das and Facebook have sided with the Indian government over the years.

Earlier this week, the company’s executives also appeared in front of a parliamentary committee, where lawmakers from both sides of the aisle questioned them about hate speech and misuse prevention on the platform.

It took constant reporting from the media, internal protests, letters from ministers, and a parliamentary hearing for Facebook to ban a politician who violated its hate speech rules.

After multiple stories about how Facebook executives courted Indian politicians for years, the company can’t even pretend not to understand the culture.

Does Facebook know better? For sure. Does it want to be proactive and take action? Its actions suggest that’s not the case.

The biggest social network in the world has followed the same pattern time and time again: take no action against hate speech despite numerous user reports.

There’s a well-defined cycle. Wait for media reports to emerge. Issue statements of apology. Take some minor rectifying actions. Rinse. Repeat.

From Cambridge Analytica to violence-inciting posts in Myanmar, Facebook has repeatedly opted to address conflicts only after outrage ensues. And more often than not, the company’s senior management has refused to take a definitive step during these crises.

Maria Ressa, a journalist from the Philippines, contacted Facebook about fake news, harassment, and abuse in 2016, and was ignored for years despite having met top company executives, including Zuckerberg. The list goes on.

In March, a WSJ report suggested that Facebook knew its algorithm was spreading extremist content, but did next to nothing to curb the polarization.

[Image: Mark Zuckerberg and Donald Trump. Credit: Donald Trump/Twitter]

This year, while Twitter proactively acted against misleading and hateful posts by Donald Trump, Facebook kept some of them up and took a long time to remove a few. And despite employee outrage, Zuckerberg and co. maintained that stance, citing the age-old reason of free speech.

There’s no doubt that hate speech is a complicated problem, and many posts are genuinely ambiguous. However, time and time again, Facebook has demonstrated that it waits far too long to take definitive action. Moderation at scale may be a difficult issue, but the company needs policy refinement attuned to the current political climate so that there are fewer outlier cases.

