This article was published on January 7, 2022

One year after the Capitol Riot, Americans still see two very different Facebooks

Data from The Markup’s Citizen Browser shows the platform’s partisan divide continues


Image by: The Markup

By: Colin Lecher and Leon Yin

Originally published on themarkup.org

Even as rioters were storming the United States Capitol one year ago, the seeds of two different narratives of Jan. 6 were already forming on social media.


In one, seen by Facebook users who voted for Joe Biden and covered by mainstream news outlets like The New York Times, a mob spurred by President Trump was incensed at what they falsely believed to be a stolen election, striking at the heart of the nation in a failed insurrection.

Trump voters on the platform, who saw events through the lens of small, hyper-partisan news outlets, were told the day had been blown out of proportion by media outlets hostile to the then-president. Perhaps, some conspiracy theories that percolated online suggested, the mob had been infiltrated by left-wing opportunists eager to frame Trump supporters for the violence.

Last year, in the days after Jan. 6, The Markup took stock of what news Americans were seeing about the riot on Facebook. Using data from our Citizen Browser project, which receives snapshots of news feeds from a panel of hundreds of Facebook users around the country, we saw how very differently news is presented to users of different political leanings on the platform.

Today, little has changed. While the topics of debate on Facebook are slightly different from what they were a year ago—COVID-19 vaccines, for one, have dominated the national discussion as of late—both sides remain in separate social media realities, largely isolated from the views of other Americans.

Take some of the news stories seen on Facebook just this week by Citizen Browser panelists, as cataloged by Split Screen, a project that tracks, in real time, what content is being shown to Citizen Browser panelists of different demographics and political leanings.

Early in the week, some of our panelists who told us they voted for Joe Biden were likely to have any number of articles from NPR appear in their news feeds. One article noted the frighteningly rapid spread of the COVID omicron variant. Another suggested upgrading to a heavy-duty N95 mask for protection against the virus.

Meanwhile, those who self-identified as Trump voters in our panel were more likely to be shown any of a number of culture-war stories from The Daily Wire, a conservative news outlet. One article quoted billionaire Elon Musk’s opposition to “wokeness,” while another noted fellow conservative outlet Fox News’s continuing dominance in cable-TV ratings.

Another popular link from The Daily Wire was to a petition urging readers to “resist tyranny” by opposing “authoritarian” vaccine mandates from the Biden administration. The page includes a stark call to action: “DO NOT COMPLY.”

And while many Biden voters in our panel were connected to an NPR feature on transgender fans of the Matrix franchise, Trump voters were more likely to see a piece from the conservative Western Journal that wrote approvingly of a man who shot and killed three teenagers allegedly attempting to rob his home.

The divide isn’t limited to news, either. Trump voters in our panel are far more likely to be recommended a Facebook group dedicated to Christian memes and jokes. A popular recommended page for Biden voters? A fan page for scientist Neil deGrasse Tyson.

In the week leading up to Thursday’s anniversary of the riot, news about the continuing fallout from Jan. 6 looked very different to our panelists, depending on whether they supported Trump or Biden, according to a review of Citizen Browser data.

In the past week, 98 of 406 Biden voter panelists in our data saw news stories that mentioned “January 6,” “Jan. 6,” or “capitol.” The one that appeared most frequently, to six panelists, was an article from NBC News about congressional testimony around Ivanka Trump’s actions during the Capitol riot. Other stories shown to multiple Biden voters in our data also focused on the ongoing House of Representatives investigation into the riot. An MSNBC article, for example, reported on text messages recently released by Congress that were sent between Fox News host Sean Hannity and Trump administration officials before the riot.

While news stories about Jan. 6 were seen by only 27 (out of 147 total) Trump voters in our panel in the past week, those stories had clear partisan leanings. One, an opinion piece from the conservative website RedState shown to two panelists, is titled, “Reporters Describe January 6, 2021, Like They Were at Omaha Beach, and It Turns My Stomach.” Two Trump voter panelists this week were also shown a Breitbart article on Republican representatives accusing House Speaker Nancy Pelosi of using the House investigation as cover for Democrats’ failure to secure the Capitol building during the riot.

Facebook (which recently rebranded itself as Meta) did not immediately respond to a request for comment, but Kevin McAlister, a spokesperson for the company, told The Markup last year that “it should not be surprising that people with different partisan leanings saw different news sources on Facebook, just as they do with television, radio, and other forms of media.”

While Citizen Browser is not an exact reflection of what’s shown to the millions of Americans on Facebook, the sample provides a window into trends on the platform that are otherwise impossible to see.

For years, researchers have raised concerns that Facebook’s algorithm may be polarizing its users through a so-called “filter bubble” effect. By continuously showing users content that reinforces views they already hold, the (still-contested) idea goes, those users are stuck in self-reinforcing cycles of political news.

“We start to see people burrow in deeper into their preexisting beliefs because of what Facebook suggests that they should follow, and also because of the reinforcement from the community that Facebook offers from these groups,” April Glaser, a senior fellow at the Harvard Kennedy School, told The Markup this week.

Facebook data scientists and Mark Zuckerberg himself have pushed back against that characterization, the latter saying in 2016 that “by giving people access to more information and helping promote diversity and a plurality of opinions, we can build stronger communities.”

Unfortunately, some researchers say, those communities can also be the kind that spread misinformation and fuel political violence.

“Facebook’s governing ideology for many years, and which it has reinforced, has long emphasized freedom of speech over community safety,” Glaser said, “and that becomes really clear when violence erupts, like we saw with Jan. 6.”

This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.
