Following the January 6 riot at the Capitol, Facebook banned Donald Trump from Facebook and Instagram. The day after Trump left office, Facebook referred that decision to its new Oversight Board. The Board is a body of lawyers, academics, and human-rights activists empowered by Facebook to review certain of its content-moderation decisions and to issue binding rulings affirming or reversing them.

Today, the Board ruled that Trump should remain banned from Facebook and Instagram, although it also asked “that Facebook apply and justify a defined penalty.” “The posts in question violated the rules of Facebook and Instagram that prohibit support or praise of violating events, including the riot that was then underway at the U.S. Capitol,” wrote the Board. But “it is not permissible for Facebook to keep a user off the platform for an undefined period, with no criteria for when or whether the account will be restored.”

“The Board’s ruling confirms the obvious: Trump gloried in violence on January 6,” said Corbin K. Barthold, Internet Policy Counsel at TechFreedom. “And the upholding of his ban from Facebook’s sites is completely justified. The evidence is clear: Donald Trump is an online misinformation super-spreader. If anything, the Board had good reason to go further than it did, and maintain Facebook’s authority to make the ban indefinite.”

“The good news, for those who disagree with the Board’s decision, is that the government had no hand in it,” Barthold continued. “Because government regulation of speech is, by and large, unconstitutional under the First Amendment, the government has very little say in what speech is or is not acceptable online. We should all be happy about that. Content moderation is, at its core, about applying intensely subjective values to an infinite variety of distinct contexts, like editing in a newsroom or overseeing a parade, only at a vastly larger scale. Like newspapers and parades, each website has the right to make these value judgments for itself.”

“Although we applaud the Board’s ruling,” Barthold concluded, “we recognize that content moderation is an evolving process, of which the Board is just one component. Facebook reviews two million pieces of content a day. In the middle two quarters of 2019, it removed 4.28 billion pieces of content, received 40.9 million appeals of those removals, and, in response to the appeals, made 10.1 million restorations. A twenty-member board that issues deliberative, adjudicatory-style decisions will only ever be one part of a much larger content-moderation process. Both Facebook and other platforms are going to continue to experiment with how to protect speech online while also curbing abhorrent content. We welcome that experimentation.” 

###
