The Facebook Oversight Board’s Trump Decision: A Missed Opportunity To Lean In.

Julia Schur
6 min read · May 5, 2021

The institutionalization of the Facebook Oversight Board

Facebook’s Oversight Board has issued its decision on the “indefinite” ban of former President Donald J. Trump’s account. The Board unfortunately failed to arrive at a firm answer on whether Trump should be permanently banned from the platform, but the decision fortunately gives us insights into the institutionalization of the Board and its power over Facebook (“FB”). The Board’s approach disappoints because it did not resolve the two questions it was tasked with: (1) what should FB do with Trump, and (2) what should FB do with other world leaders who spread disinformation and celebrate violence? The first question was simply sent back to FB, and the second was largely unaddressed.

On January 7, 2021, FB, facing the aftermath of the January 6 insurrection, balanced the public interest with individual interests and decided that Trump should not be on its platform. The Board explains it was wrong to “indefinitely” ban Trump because that decision did not align with FB’s existing terms of use. On the one hand, the Board can be applauded for following standards, flagging some of FB’s opaque practices, and placing the responsibility on FB itself to decide whether Trump should be banned permanently. The Board’s decision focuses attention on FB’s blurry “‘newsworthiness allowance’ which allows content that violates its policies to remain on the platform, if FB considers the content ‘newsworthy and in the public interest,’” even though FB also asserted that it never applied the “newsworthiness allowance to content posted by Trump”.

On the other hand, people expected (or hoped) the Board would help FB arrive at a final answer to the two questions asked. FB failed to hold Trump accountable throughout his Presidency, and it bears some responsibility for creating a dangerous product, one that propagated hateful and violent content without accountability. This decision weighs particularly on those who have been harmed on FB’s platform due to FB’s lack of consistent, reasonable, transparent, and humane decision making. Images of a female nipple can result in accounts being permanently disabled, while angry white supremacists run free and jealous exes dox their current or former partners, often unpunished, without so much as a warning.

The Board did not take advantage of its position to offer FB tangible options for addressing Trump’s presence on (or continued absence from) the platform.

The Board’s local & international influences

The Board bases its decision on Facebook’s content policies, FB’s values, and international human rights law (the Board does not apply local law or U.S. law, due to FB’s global reach and the need for an international standard). Any restrictions must meet three requirements: “Rules must be clear and accessible, they must be designed for a legitimate aim, and they must be necessary and proportionate to the risk of harm”.

The Board adds, “First Amendment principles under U.S. law also insist that restrictions on freedom of speech imposed through state action may not be vague, must be important governmental reasons and must be narrowly tailored to the risk of harm.”

This analysis seems like the Board’s attempt to be less American-focused in its approach to content moderation. Tiffany Li, a technology lawyer, legal scholar, and Professor at Boston University School of Law, explains in her MSNBC article that the Board has been criticized for being too American and lacking foreign representation in its makeup. She brilliantly writes, “Treating content moderation choices as Supreme Court decisions on free speech is a very American way to think about social media regulation. Perhaps nothing is more American than turning to a private corporation to regulate itself, with no consequences elsewhere.” The Board’s cautious and incomplete decision only reinforces concerns that it is more of a marketing tool than a force guiding FB to make the right decisions.

Why the Board should have clearly explained how Trump should be permanently banned

There are many reasons to ban an account like Trump’s. Here are a few I wish the Board had emphasized more:

  • His reach and his ability to incite and reward violence are too great. The Board did find that Trump’s January 6, 2021 posts violated community standards prohibiting praise or support of people engaged in violence: he wrote “We love you. You’re very special” in one post, and in a second post referred to the insurrectionists as “great patriots” and said he would “remember this day forever.”
  • He has failed to recognize that his conduct was wrong or harmful.
  • The Board recognizes that Trump continues to claim that the election was “stolen,” thereby sustaining an environment likely to produce a serious risk of violence.
  • Violent mobs do not dissipate online, they multiply.
  • Facebook has failed to effectively control the spread of harmful content before, and there is every reason to fear it will fail in the future.

Part of the Board’s decision tasks FB with further research. For example, the Board asks FB to “undertake a comprehensive review of its potential contribution to the narrative of electoral fraud and the exacerbated tensions that culminated in the violence in the United States on January 6, 2021.” This exercise in due diligence risks slowing down the implementation of change, while FB is already on a tight six-month deadline. There is abundant proof that FB played a role in what happened on January 6, through its advertising and the viral spread of disinformation.

The Board’s failures

Is the Board’s decision, even if thoughtful and reasoned, a mask hiding a different reality? The Board is a tool FB uses to shield itself in light of the questioning it faced over its content moderation practices. The Board is not the Supreme Court; it is part of FB, a business, and all of its decisions are made in furtherance of that business.

FB fails to provide people with a sense of relief, a sense that help is on the way. This was inevitable. After all, the Board is FB: its members are generously paid by the very company they are asked to judge, and they are tasked with evaluating a powerful business that failed to regulate itself in the first place. Even with the best of intentions and a good faith effort, it is hard to see how such a body will lessen the impact on human rights and democracy of the rhetoric of figures like Trump. This will have an impact not only in America, but also in other countries.

Trump violated the company’s terms of service: he harassed other users, incited violence, rewarded creeps, and created and spread harmful disinformation about democracy and public health. In the words of Danielle Citron and Hany Farid, Trump “was an Olympic-level policy violator — every day brought a new violation, to the detriment of our democracy, national security, and safety. He should not be allowed back.” This is not what we heard from the Board.

The Board is a tool, but it is not the solution. Government regulation and other measures are needed (you guessed it, I am thinking of how Section 230 needs to be amended). Trump’s presence on the platform creates a threat to democracy because he thrives on disinformation, which regulation alone has failed to curb and which platforms were slow to confront.

The Board’s greatest failure is that it confined itself to a limited institutional role: it did not offer alternative reasoning for how to restrict Trump’s speech, something courts would offer and something advisors should do. Platforms can make executive decisions to ban certain people from their platforms; indeed, it happens all the time. Naturally, Trump is not just anyone, but instead of handling a would-be dictator like Trump with white gloves and caution, FB should have met him with a stronger response.

Democracy is fragile; FB and the Board should prioritize protecting fundamental human rights, including the right to the sanctity of elections. The Board may be right to place FB in the position of having the final say, but FB always would have had that final say, as the Board has no real authority. The Board missed its chance to clean up FB by taking a firm stand and deciding that Trump does not have a place on the platform, now or in the future.

*The views expressed in this article are those of the individual author writing in her individual capacity only — not those of her employer. This article does not constitute legal advice.


Julia Schur

Lawyer who writes about privacy, civil rights, intellectual property and technology.