For the better part of this year, Donald Trump has continuously shown flashes of governing America like an authoritarian ruling over a failed state: the commander in chief had peaceful protesters tear-gassed right outside of his residence; he commanded violent far-right gangs to “stand by,” presumably to await his future orders; and he knowingly downplayed the deadly nature of a virus weeks before it began a kill streak that has taken 225,000 American lives and counting. After months of these disturbing moments that have resulted in an increasingly polarized electorate, one of the most crucial channels of communication in the U.S. is preparing to take some of the same pre-election precautions previously utilized only in “at-risk” countries.
With the presidential election less than two weeks away, Facebook is reportedly preparing to take steps to quell conflict that seems likely to arise in the aftermath of November 3—the kind of political turmoil that, in the past, has been stoked on social media before spilling out onto the streets. The social media giant’s set of election contingency policies includes limiting the speed at which viral posts spread among its users; lowering the threshold for what the platform deems too inflammatory to remain on its site; adjusting what kinds of content appear on users’ timelines; and vigilantly looking out for software that could be used to promote potentially dangerous content, according to a report by the Wall Street Journal. In the past, Facebook used these policies in countries like Sri Lanka and Myanmar—though, in the latter case, the company was accused of allowing its platform to be used in the incitement of genocidal violence against the Rohingya Muslim minority.
Andy Stone, a Facebook spokesperson, explained the company’s election contingency plan in a statement on Sunday. “We’ve spent years building for safer, more secure elections,” he said. “We’ve applied lessons from previous elections, hired experts, and built new teams with experience across different areas to prepare for various scenarios.”
While Facebook has been criticized by Democratic lawmakers and left-leaning pundits for not doing more to limit the spread of disinformation in the aftermath of the 2016 election, it seems that the company’s primary critics this time around are on the right. Most recently, the platform was pilloried after it restricted the New York Post’s dubiously sourced report on Hunter Biden’s “laptop from hell.” Last week, Facebook also took down 48 of the Trump campaign’s reelection advertisements that claimed “your vote has not been counted.” Both moves came after Facebook CEO Mark Zuckerberg acknowledged in an Axios interview that his company needs to be “doing everything that we can to reduce the chances of violence or civil unrest in the wake of this election.” Additionally, Zuckerberg recently announced that the platform is prepared to reject election-victory claims made by campaigns before an official decision is declared. But the Facebook founder did note last week that he does not “expect that we continue to adopt a lot more policies that are restricting of a lot more content” once the election and its results are solidified.
Amid one of the more contentious elections in modern U.S. history, a once-in-a-century global pandemic, and months of continuous political unrest, Facebook has taken several steps to contain misinformation and calls for political violence. Over the summer, the company banned a network of accounts affiliated with the anti-government Boogaloo, a right-wing movement of violent accelerationists who were linked to the killings of law enforcement officers. In August, the site also deleted a post by Trump that touted unproven and ineffective cures for COVID-19. And early this month, it took action against one of its worst scourges, finally removing the QAnon pages and groups that had been allowed to fester and multiply unabated for years.
Still, critics worry the tech giant’s blind spots could allow some things to slip through the cracks. Its announcement banning new political advertisements in the week leading up to the election left a pretty big loophole: Ads already approved for that week can still run. And in late August, it was reported that the platform failed to take action around militia group Kenosha Guard, several members of which maintained a presence in Kenosha, Wisconsin, in the aftermath of the police shooting of Jacob Blake, despite receiving at least two reports about the group. (In a comment to The Verge, Facebook said its investigation produced no direct link between Kyle Rittenhouse, who shot two protesters, and the militia group.) The aftermath of the 2020 election will represent a challenge unlike any the company has faced thus far.