This article was published on July 13, 2021

Surprise, surprise: Dunking on your enemies boosts social media engagement

Emotional language has a lesser impact

A new study has further exposed the effect of polarizing content on social media engagement.

Researchers from the universities of Cambridge and New York found that tweets and Facebook posts about opposing political parties are far more likely to be shared than content containing emotional language.

Up front: Social media platforms have long been accused of algorithmically promoting divisive content, because it’s more likely to go viral and thereby attract ad revenue.

The researchers sought to quantify how dunking on one’s enemies can maximize this engagement.

The team analyzed around 2.7 million posts from news media accounts and members of the US Congress.

They found that posts about political opponents were shared roughly twice as often as those about one’s own party. In addition, each extra word about an adversary — such as “Democrat” or “Leftist” in a post by a Republican — increased the odds of a share by 67%.

Notably, emotional language was far less likely to boost engagement.

Per the study paper:

Each individual term referring to the political out-group increased the odds of a social media post being shared by 67%. Out-group language consistently emerged as the strongest predictor of shares and retweets: the average effect size of out-group language was about 4.8 times as strong as that of negative affect language and about 6.7 times as strong as that of moral-emotional language.
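
To make that headline number concrete: the 67% figure is an increase in the odds of a share, not the probability. Here's a minimal sketch, assuming a made-up 10% baseline share probability (the baseline is not from the study), of how that per-term odds ratio compounds:

```python
# Hypothetical illustration, not taken from the study: how a 67% increase in the
# *odds* of a share per out-group term compounds. The 10% baseline share
# probability is an assumed number, chosen only to make the arithmetic concrete.

def to_odds(p: float) -> float:
    """Convert a probability to odds."""
    return p / (1 - p)

def to_prob(odds: float) -> float:
    """Convert odds back to a probability."""
    return odds / (1 + odds)

BASELINE_SHARE_PROB = 0.10  # assumed baseline chance that a post gets shared
ODDS_RATIO_PER_TERM = 1.67  # the reported 67% increase in odds per out-group term

for n_terms in range(4):
    odds = to_odds(BASELINE_SHARE_PROB) * ODDS_RATIO_PER_TERM ** n_terms
    print(f"{n_terms} out-group term(s) -> share probability of about {to_prob(odds):.1%}")
```

Under that assumed baseline, a post with three out-group terms would go from a one-in-ten chance of being shared to roughly one in three.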

The researchers warn that amplifying this rhetoric can incite real-world violence, such as the storming of the US Capitol in January.

Quick take: It’s not surprising that dunking on opponents is a strong driver of virality. Social media platforms are acutely aware of the effect.

The Wall Street Journal reported last year that Facebook researchers had warned the company that their “algorithms exploit the human brain’s attraction to divisiveness.” Company executives allegedly shut the research down and declined to implement changes proposed by the team.

More intriguingly, the study found that “emotional language was weakly associated with the various Facebook reactions.” This suggests that out-group identity language is a stronger predictor of engagement than emotional words alone.

HT — The Washington Post
