Facebook Doesn’t Want Attention Right Now

The Atlantic

After the 2016 elections, critics blamed Facebook for undermining American democracy. They believed that the app’s algorithmic News Feed pushed hyperpartisan content, outright fake news, and Russian-seeded disinformation to huge numbers of people. (The U.S. director of national intelligence agreed, and in January 2017 declassified a report that detailed Russia’s actions.) At first, the company’s executives dismissed these concerns—shortly after Donald Trump won the presidential election, Mark Zuckerberg said it was “pretty crazy” to think that fake news on Facebook had played a role—but they soon grew contrite. “Calling that crazy was dismissive and I regret it,” Zuckerberg would say 10 months later. Facebook had by then conceded that its own data did “not contradict” the intelligence report. Shortly thereafter, Adam Mosseri, the executive in charge of News Feed at the time, told this magazine that the company was launching a number of new initiatives “to stop the spread of misinformation, click-bait and other problematic content on Facebook.” He added: “We’ve learned things since the election, and we take our responsibility to protect the community of people who use Facebook seriously.”

Nowhere was the effort more apparent than in the launch of the company’s “war room” ahead of the 2018 midterms. Here, employees across departments would come together in front of a huge bank of computers to monitor Facebook for misinformation, fake news, threats of violence, and other crises. Numerous reporters were invited in at the time; The Verge, Wired, and The New York Times were among the outlets that ran access-driven stories about the effort. But the war room looked, to some, less like a solution and more like a mollifying stunt—a show put on for the press. And by 2020, with the rise of QAnon conspiracy theories and “Stop the Steal” groups, things on Facebook did not seem to have gotten much better.

[Read: What Facebook did to American democracy]

What is happening on Facebook now? On the eve of another chaotic election, journalists have found that highly deceptive political advertisements still run amok there, as do election-fraud conspiracy theories. The Times reported in September that the company, now called Meta, had fewer full-time employees working on election integrity and that Zuckerberg was no longer having weekly meetings with the lieutenants in charge of them. The paper also reported that Meta had replaced the war room with a less sharply defined “election operations center.”

When I reached out to Meta to ask about its plans, the company did not give many specific details. But Corey Chambliss, a Meta spokesperson focused on election preparedness, told me that the war room definitely still exists and that “election operations center” is just another of its names. He proved this with a video clip showing B-roll footage of a few dozen employees working in a conference room on Super Tuesday. The video had been shot in Meta’s Washington, D.C., office, but Chambliss impressed upon me that it could really be anywhere: The war room moves and exists in multiple places. “Wouldn’t want to over-emphasize the physical space as it’s sort of immaterial,” he wrote in an email.

It is clear that Meta wants to keep its name out of this election as much as possible. It may marshal its considerable resources and massive content-moderation apparatus to enforce its policies against election interference, and it may “break the glass,” as it did in 2021, to take additional action if something as dramatic as January 6 happens again. At the same time, it won’t draw a lot of attention to those efforts or be very specific about them. Recent conversations I’ve had with a former policy lead at the company and academics who have worked with and studied Facebook, as well as Chambliss, made it clear that as a matter of policy, the company has done whatever it can to fly under the radar this election season—including Zuckerberg’s declining to endorse a candidate, as he has in previous presidential elections. When it comes to politics, Meta and Zuckerberg have decided that there is no winning. At this pivotal moment, the company is simply doing less.

Meta’s war room may be real, but it is also just a symbol—its meaning has been haggled over for six years now, and its name doesn’t really matter. “People got very obsessed with the naming of this room,” Katie Harbath, a former public-policy director at Facebook who left the company in March 2021, told me. She disagreed with the idea that the room was ever a publicity stunt. “I spent a lot of time in that very smelly, windowless room,” she said. I wondered whether the war room—ambiguous in terms of both its accomplishments and its very existence—was the perfect way to understand the company’s approach to election chaos. I posed to Harbath that the conversation around the war room was really about the anxiety of not knowing what, precisely, Meta is doing behind closed doors to meet the challenges of the moment.

She agreed that part of the reason the room was created was to help people imagine content moderation. Its primary purpose was practical and logistical, she said, but it was “a way to give a visual representation of what the work looks like too.” That’s why, this year, the situation is so muddy. Meta doesn’t want you to think there is no war room, but it isn’t drawing attention to the war room. There was no press junket; there were no tours. There is no longer even a visual of the war room as a specific room in one place.

This is emblematic of Meta’s in-between approach this year. Meta has explicit rules against election misinformation on its platforms; these include a policy against content that attempts to deceive people about where and how to vote. The rules do not, as written, include false claims about election results (although such claims are prohibited in paid ads). Posts about the Big Lie—the false claim that the 2020 presidential election was stolen—were initially moderated with fact-checking labels, but these were scaled back dramatically before the 2022 midterms, purportedly because users disliked them. The company also made a significant policy update this year to clarify that it would require labels on AI-generated content (a change made after its Oversight Board criticized its previous manipulated-media policy as “incoherent”). But tons of unlabeled generative-AI slop still flows without consequence on Facebook.

[Read: “History will not judge us kindly”]

In recent years, Meta has also attempted to de-prioritize political content of all kinds in its various feeds. “As we’ve said for years, people have told us they want to see less politics overall while still being able to engage with political content on our platforms if they want,” Chambliss told me. “That’s exactly what we’ve been doing.” When I emailed to ask questions about the company’s election plans, Chambliss initially responded by linking me to a short blog post that Meta put out 11 months ago, and attaching a broadly circulated fact sheet, which included such vague figures as “$20 billion invested in teams and technology in this area since 2016.” This information is next-to-impossible for a member of the public to make sense of—how is anyone supposed to know what $20 billion can buy?

In some respects, Meta’s reticence is just part of a broader cultural shift. Content moderation has become politically charged in recent years. Many high-profile misinformation and disinformation research projects born in the aftermath of the January 6 insurrection have shut down or shrunk. (When the Stanford Internet Observatory, an organization that published regular reports on election integrity and misinformation, shut down, right-wing bloggers celebrated the end of its “reign of censorship.”) The Biden administration experimented in 2022 with creating a Disinformation Governance Board, but quickly abandoned the plan after it drew a firestorm from the right—whose pundits and influencers portrayed the proposal as one for a totalitarian “Ministry of Truth.” The academic who had been tasked with leading it was targeted so intensely that she resigned.

“Meta has definitely been quieter,” Harbath said. “They’re not sticking their heads out there with public announcements.” This is partly because Zuckerberg has become personally exasperated with politics, she speculated. She added that it is also the result of the response the company got in 2020—accusations from Democrats of doing too little, accusations from Republicans of doing far too much. The far right was, for a while, fixated on the idea that Zuckerberg had personally rigged the presidential election in favor of Joe Biden and that he frequently bowed to Orwellian pressure from the Biden administration afterward. In recent months, Zuckerberg has been oddly conciliatory about this position; in August, he wrote what amounted to an apology letter to Representative Jim Jordan of Ohio, saying that Meta had overdone it with its efforts to curtail COVID-19 misinformation and that it had erred by intervening to minimize the spread of the salacious news story about Hunter Biden and his misplaced laptop.  

Zuckerberg and his wife, Priscilla Chan, used to donate large sums of money to nonpartisan election infrastructure through their philanthropic foundation. They haven’t done so this election cycle, seeking to avoid a repeat of the controversy ginned up by Republicans the last time. This has not been enough to satisfy Trump, though, and he recently threatened to put Zuckerberg in prison for the rest of his life if he makes any political missteps—which may, of course, be one of the factors Zuckerberg is considering in choosing to stay silent.

Other circumstances have changed dramatically since 2020, too. Just before that election, the sitting president was pushing conspiracy theories about the election, about various groups of his own constituents, and about a pandemic that had already killed hundreds of thousands of Americans. He was still using Facebook, as were the adherents of QAnon, the violent conspiracy theory that positioned him as a redeeming godlike figure. After the 2020 election, Meta said publicly that Facebook would no longer recommend political or civic groups for users to join—clearly in response to the criticism that the site’s own recommendations guided people into “Stop the Steal” groups. And though Facebook banned Trump himself for using the platform to incite violence on January 6, the platform reinstated his account once it became clear that he would again be running for president.

This election won’t be like the previous one. QAnon simply isn’t as present in the general culture, in part because of actions that Meta and other platforms took in 2020 and 2021. More will happen on other platforms this year, in more private spaces, such as Telegram groups. And this year’s “Stop the Steal” movement will likely need less help from Facebook to build momentum: YouTube and Trump’s own social platform, Truth Social, are highly effective for this purpose. Election denial has also been galvanized from the top by right-wing influencers and media personalities including Elon Musk, who has turned X into the perfect platform for spreading conspiracy theories about voter fraud. He pushes them himself all the time.

In many ways, understanding Facebook’s relevance is harder than ever. A recent survey from the Pew Research Center found that 33 percent of U.S. adults say they “regularly” get news from the platform. But over the past two years, Meta has restricted the access that both journalists and academics have to its data. After the 2020 election, the company partnered with academics for a huge research project to sort out what happened and to examine Facebook’s broader role in American politics. It was cited when Zuckerberg was pressed to answer for Facebook’s role in the organization of the “Stop the Steal” movement and January 6: “We believe that independent researchers and our democratically elected officials are best positioned to complete an objective review of these events,” he said at the time. That project is coming to an end, some of the researchers involved told me, and Chambliss confirmed.

The first big release of research papers produced through the partnership, which gave researchers an unprecedented degree of access to platform data, came last summer. Still more papers will continue to be published as they pass peer review and are accepted to scientific journals—one paper in its final stages will deal with the diffusion of misinformation—but all of these studies were conducted using data from 2020 and 2021. No new data have been or will be provided to these researchers.

When I asked Chambliss about the end of the partnership, he emphasized that no other platform had bothered to do as robust a research project. However, he wouldn’t say exactly why it was coming to an end. “It’s a little frustrating that such a massive and unprecedented undertaking that literally no other platform has done is put to us as a question of ‘why not repeat this?’ vs asking peer companies why they haven't come close to making similar commitments for past or current elections,” he wrote in an email.

The company also shut down the data-analysis tool CrowdTangle—used widely by researchers and by journalists—earlier this year. It touts new tools that have been made available to researchers, but academics scoff at the claim that they approximate anything like real access to live and robust information. Without Meta’s cooperation, it becomes much harder for academics to effectively monitor what happens on its platforms.

I recently spoke with Kathleen Carley, a professor at Carnegie Mellon’s School of Computer Science, about research she conducted from 2020 to 2022 on the rise of “pink slime,” a type of mass-produced misinformation designed to look like the product of local newspapers and to be shared on social media. Repeating that type of study for the 2024 election would cost half a million dollars, she estimated, because researchers now have to pay if they want broad data access. Her observations and the more targeted, “surgical” data pulls that her team has been able to do this year suggest that pink-slime sites are far more concentrated in swing states than they were previously, and that conspiracy theories are spreading just as easily as ever. But these are observations; they’re not a real monitoring effort, which would be too costly.

“Monitoring implies that we’re doing consistent data crawls and have wide-open access to data,” she told me, “which we do not.” This time around, nobody will.