How Facebook does (and doesn’t) shape political views

Today let's talk about some of the most rigorous research we've seen to date on the topic of social networks' influence on politics, and the predictably intense debate surrounding how to interpret it.

I.

Even before 2021, when Frances Haugen rocked the company by releasing thousands of documents detailing its internal research and discussions, Meta faced repeated calls to collaborate with academic social scientists. I have argued that it is ultimately in the company's interest to do so, since the absence of good research on social networks has fueled strong beliefs around the world that social networks are harmful to democracy. If that's not true, as Meta insists it isn't, the company's best course of action is to enable independent research on that question.

The company long ago agreed to do that, in principle. But it has been a rocky road. The Cambridge Analytica data privacy scandal of 2018, which stemmed from an academic research partnership, made Meta understandably anxious about sharing data with social scientists. A later project with a nonprofit called Social Science One went nowhere: Meta took so long to produce the data that the project's biggest backers quit before it published anything of note. (It later emerged that Meta had accidentally provided the researchers with bad data, effectively ruining the research in progress.)

Three papers sought to understand how the Facebook news feed affected users’ experiences and beliefs

Despite those setbacks, Meta and researchers continue to find new ways to work together. On Thursday, the first research coming out of this work was published.

Three papers in Science and one in Nature attempt to understand how the content of the Facebook news feed affected users' experiences and beliefs. The studies analyzed data on Facebook users in the United States from September to December 2020, covering the period during and immediately after the US presidential election.

Kai Kupferschmidt summarized the findings in a piece for Science:

In one experiment, researchers prevented Facebook users from seeing any "reshared" posts; in another, they displayed Instagram and Facebook feeds to users in reverse chronological order, instead of in the order chosen by Meta's curation algorithm. Both studies were published in Science. In a third study, published in Nature, the team reduced by one-third the number of posts Facebook users saw from "like-minded" sources, meaning people who share their political leanings.
In each of the experiments, these tweaks changed the kind of content users saw: Removing reshared posts meant people saw far less political news and less news from untrustworthy sources, for example, but more uncivil content. Replacing the algorithm with a chronological feed resulted in people seeing more untrustworthy content (because Meta's algorithm downranks sources that repeatedly share misinformation), although hateful and intolerant content was almost halved. Users in the experiments also ended up spending much less time on the platforms than other users, suggesting the feeds had become less compelling.
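The three interventions amount to simple transformations of a ranked feed. Here is a minimal, hypothetical sketch of what they might look like in code; the Post fields, the engagement_score stand-in, and all function names are illustrative assumptions, not Meta's actual implementation.

```python
# Hypothetical sketch of the three feed manipulations described above.
# All names and fields are illustrative, not Meta's actual code.
from dataclasses import dataclass
import random

@dataclass
class Post:
    author_leaning: str      # e.g. "left", "right", "none"
    is_reshare: bool         # True if reshared rather than original
    timestamp: float         # seconds since epoch
    engagement_score: float  # stand-in for a ranking model's prediction

def algorithmic_feed(posts):
    """Baseline: rank by the engagement model's score, highest first."""
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def chronological_feed(posts):
    """Experiment 1: reverse-chronological order, ignoring the ranking model."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def no_reshares_feed(posts):
    """Experiment 2: drop reshared posts entirely, then rank as usual."""
    return algorithmic_feed([p for p in posts if not p.is_reshare])

def reduced_like_minded_feed(posts, user_leaning, drop_fraction=1/3):
    """Experiment 3: randomly withhold about a third of like-minded posts."""
    kept = [p for p in posts
            if p.author_leaning != user_leaning
            or random.random() > drop_fraction]
    return algorithmic_feed(kept)
```

Even in this toy form, the design choice is visible: each condition changes what users see without touching what they believe, which is precisely the distinction the surveys were meant to test.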

By themselves, the findings fail to confirm the arguments of Meta's worst critics, who believe that the company's products have played a leading role in polarizing the United States and threatening democracy. But they also don't suggest that changing the feed in the way some lawmakers have called for, by making it chronological rather than ranked by engagement and other signals, will have a positive impact.

"Surveys during and after the experiments show that these differences do not translate into measurable effects on users' attitudes," Kupferschmidt writes. "Participants did not differ from other users in how polarized their views were on issues such as tourist immigration, Covid-19 restrictions or racial discrimination, for example, or in their knowledge of elections, their trust in the media and political institutions, or their trust in the legitimacy of elections. They too were more or less likely to vote in the 2020 elections.”

II.

Against this somewhat muddled backdrop, it is not surprising that a battle has erupted over what conclusions we should draw from the studies.

Meta, for its part, suggested that the findings show that social networks have only a limited impact on politics.

"Although questions about the impact of social media on key political attitudes, beliefs, and behaviors are not fully resolved, the empirical findings add to a growing body of research showing that key features of Meta's platforms alone have detrimental 'effective' polarization or meaningful effects on these outcomes," Nick Clegg, the company's president of global affairs, wrote in a blog post: "They also now challenge the common claim that the ability to re-share content on social media drives polarization."

But behind the scenes, as Jeff Horwitz reports in The Wall Street Journal, Meta and the social scientists are fighting over whether that's true.

Horwitz writes:

The academic leads, New York University professor Joshua Tucker and University of Texas at Austin professor Talia Stroud, said that while the studies show that simple algorithm changes did not make test subjects less polarized, the papers were full of caveats and possible explanations for why such limited changes did not shift users' overall views of politics in the final months of the 2020 election.

"The findings of these papers do not support all of those assertions," Stroud said. Clegg's comment "is not a statement we would make."


Science headlined its package on the studies "Wired to Split," which led to this wonderful detail in Horwitz's story: representatives of the publication said that Meta and outside researchers had asked for a question mark to be added to the headline to reflect the uncertainty of the findings, but that the publication declined because it deemed the presentation of the research fair.

Meagan Phelan, who worked on the package for Science, wrote to Meta earlier this week that the journal's findings did not exonerate the social network, Horwitz reported. "The findings of the research suggest Meta's algorithms are an important part of what is keeping people divided," she wrote.

What to make of all this?

While researchers struggle to draw definitive conclusions, some things seem clear. 

Facebook represents only one facet of the broader media ecosystem

One, as limited as these studies may seem in their scope, they represent some of the most significant efforts to date for a platform to share data like this with outside researchers. And despite valid concerns from many of the researchers involved, in the end Meta did grant them most of the independence they were seeking. That’s according to an accompanying report from Michael W. Wagner, a professor of mass communications at the University of Wisconsin-Madison, who served as an independent observer of the studies. Wagner found flaws in the process — more on those in a minute — but for the most part he found that Meta lived up to its promises.

Two, the findings are consistent with the idea that Facebook represents only one facet of the broader media ecosystem, and most people’s beliefs are informed by a variety of sources. Facebook might have removed “stop the steal”-related content in 2020, for example, but election lies still ran rampant on Fox News, Newsmax, and other sources popular with conservatives. The rot in our democracy runs much deeper than what you find on Facebook; as I’ve said here before, you can’t solve fascism at the level of tech policy.

At the same time, it seems clear that the design of Facebook does influence what people see, and may shift their beliefs over time. These studies cover a relatively short period — during which, I would note, the company had enacted “break the glass” measures designed to show people higher-quality news — and even still there was cause for concern. (In the Journal’s story, Phelan observed that “compared to liberals, politically conservative users were far more siloed in their news sources, driven in part by algorithmic processes, and especially apparent on Facebook’s Pages and Groups.”)

Perhaps most importantly, these studies don’t seek to measure how Facebook and other social networks have reshaped our politics more generally. It’s inarguable that politicians campaign and govern differently now than they did before they could use Facebook and other networks to broadcast their views to the masses. Social media changes how news gets written, how headlines are crafted, how news gets distributed, and how we discuss it. It’s possible that the most profound effects of social networks on democracy lie somewhere in this mix of factors — and the studies released today only really gesture at them.

III.

The good news is that more research is on the way. The four studies released today will be followed by 12 more covering the same time period. Perhaps, in their totality, we will be able to draw stronger conclusions than we can right now.

I want to end, though, on two criticisms of the research as it has unfolded so far. Both come from Wagner, who spent more than 500 hours observing the project over more than 350 meetings with researchers. One problem with this sort of collaboration between academia and industry, he wrote, is that scientists must first know what to ask Meta for — and often they don’t.

“Independence by permission is not independence at all.”

"One shortcoming of industry-academia collaborative research models in general, which is reflected in these studies, is that they do not deeply engage with how complex the data architecture and programming code are in corporations like Meta," he wrote. "Simply put, researchers don't know what they don't know, and the incentives for industry partners to disclose everything they know about their platforms aren't clear."

Another major flaw, he wrote, was that the research was ultimately done on Meta's terms rather than on scientists' terms. There are some good reasons for this: Facebook users have a right to privacy, and regulators would punish the company harshly for violating it. But the trade-off is real.

"In the end, freedom by permission is not freedom at all," Wagner concludes. “Rather, it's a sign of things to come in academia: incredible data and research opportunities offered to a select few researchers at the expense of true freedom. Scholarship is not entirely independent when the data is held by for-profit corporations, and it is not independent when those same corporations can limit the nature of its study."