The Facebook Oversight Board’s decision to maintain the former president’s ban won’t change much, at least for the time being
On May 5, 2021, Facebook’s Oversight Board upheld Facebook’s January 2021 ban of former president Trump from Facebook and Instagram for encouraging violence during the January 6, 2021, U.S. Capitol siege, based on two posts Trump made on Instagram and Facebook that afternoon and evening calling the rioters “great patriots” and “very special.” Facebook had asked the 20-member Board whether its decision was correct and sought recommendations on future suspensions of political leaders. However, the Board left open the possibility of Trump’s return to Facebook, finding that his indefinite suspension was “standardless” and giving Facebook six months to modify its penalties against Trump proportionately, based on its existing platform rules. “Facebook needs to review the matter itself,” the Board wrote, and “determine and justify a proportionate response that is consistent with the rules that are applied to other users of its platform.”
The Board’s decision attracted a range of positions. Some think it deflects responsibility back to Facebook; others wish for clearer standards, at a minimum the adoption of the Board minority’s view that Facebook should actively seek to prevent the repetition of the adverse human rights effects. Some groups criticize the decision as an example of Facebook’s enormous power over public expression. In any case, the true force of the Board’s decision will remain unknown until Facebook decides, within six months, on new standards for content moderation based on the Board’s recommendations (if it does so at all). Such a decision is expected to reverberate through the entire Internet ecosystem.
In the whirlpool of opinions, views about the Board itself and its competence were voiced as well. The creation of the Board “was essentially a media or PR campaign,” argues Joan Donovan, research director at the Harvard Kennedy School’s Shorenstein Center on Media, Politics, and Public Policy. The Board’s approach means that Facebook has been tasked with deciding for itself how to apply its own policies.
“When it comes to Facebook, you have to remember that Facebook isn’t just a place where people post messages,” says Donovan. Facebook is an organizing tool and a broadcast network in one, and its power is constantly used for good or for ill.
The repercussions of the Board’s decision in Trump’s case can be manifold. One question is how it will affect Facebook’s own activities. First, it may catalyze the company’s human rights protection policy. In its review, the Board agreed with Facebook that Trump’s two posts violated Facebook’s Community Standards and Instagram’s Community Guidelines against praise or support of those engaged in violence. Additionally, by continuing his unsupported narrative of electoral fraud and calling people to action in a context of “immediate risk of harm,” when protestors were already at the Capitol, Trump’s posts created a “serious risk of violence.” However, the Board found that, rather than imposing an indefinite suspension, Facebook must “apply and justify a defined penalty” based on its rules for severe violations and the risk of future harm. A minority of the Board believed Facebook should go further, ensuring that users who later seek reinstatement of their accounts actively recognize their wrongdoing and commit to following the rules in the future.
The Board emphasized that newsworthiness considerations should not take priority when urgent action is needed to prevent significant harm, and that there should not necessarily be a distinction between the treatment of political leaders and that of other influential users. The Board further stated that Facebook’s rules need transparency, and that the company should publicly explain its rules when it imposes account-level sanctions against influential users. The accounts of high government officials should be suspended for a limited time if they have repeatedly created posts that pose a risk of harm under international human rights norms. The Board also found that Facebook should escalate political speech from influential users to specialized staff members who are familiar with the speech’s context but insulated from undue influence and from political and economic interference. This points to the need to distinguish the accountability of public figures in political debate and recalls the rules that apply to traditional media.
The Board’s decision will undeniably spur continued discussion of a private company’s ability to remove user content and, importantly, of who gets to decide whether to remove it. It will feed into the recent debate on amending the US legislation favorable to platform and service providers. The decision will have far-reaching implications for Section 230 of the Communications Decency Act, which has become a hot issue for Congress and media specialists. Facebook’s ability to ban a political leader’s speech also raises interesting First Amendment questions, since here a private actor is regulating the speech of political leaders.
Let us now cast a look at the side of the culprit. The night before the Facebook Oversight Board decided to stand by the company’s decision to ban him from its platforms, former President Donald Trump announced that he had created a website. Called “From the Desk of Donald Trump,” it resembled a social media site but was in fact a one-way feed of Trump’s statements, launched in anticipation of the Board’s announcement that it was upholding Facebook’s ban.
For years, Trump was at the center of public attention for his rampant conduct on social media: a head of state using his personal Twitter account to amplify extremist content, promote dangerous conspiracy theories, and speak directly to followers, who in the end were willing to storm the Capitol to try to overturn an election they falsely believed was stolen. Companies like Facebook and Twitter refrained from interfering with Trump’s social media posts, claiming their “newsworthiness” should keep them protected even when he broke platform rules on abuse or disinformation. That permissive attitude began to change during the Covid pandemic, as Trump used his account to repeatedly spread misinformation about both voting and the virus. At the same time, he threatened to abolish Section 230, the rule that shields many Internet companies from liability for user-generated content.
No matter what measures Facebook takes, the communication cycle will repeat. Trump will continue to issue statements, and they will be shared by his supporters and covered by the media whether or not he is on social media. The networked attention cycle that revolved around him for so long will continue without him at its center, as will the underlying structures that make an influential social media presence possible for Trump or any other political leader.
Within the six-month deadline the Board has set, the public will no doubt see another news cycle about Trump’s presence on social media. And this will prove the vicious circle in which Facebook and other social media companies are placed.
Compiled by Media 21 Foundation from: https://www.technologyreview.com/2021/05/05/1024595/trump-facebook-ban-oversight-board/