WASHINGTON, D.C. — Today, the Supreme Court declined to review the Ninth Circuit’s decision in Enigma Software v. Malwarebytes. TechFreedom filed an amicus brief supporting a grant of cert. The court of appeals’ decision imported a “good faith” requirement into Section 230(c)(2)(B), a little-noticed provision that protects those who offer content filtering tools to others. Despite the narrowness of the case, Justice Thomas issued a 10-page opinion agreeing with the decision not to take the case but lambasting what he claims is an overly broad reading of other provisions of the statute. His separate statement largely parallels arguments the Trump Administration and Congressional Republicans have been making all year: that Section 230’s protections for content moderation should be narrowed significantly.

“Justice Thomas objected to courts’ ‘rel[ying] on purpose and policy’ when interpreting Section 230, yet that is precisely what the Ninth Circuit did; it’s why the Supreme Court should have taken this case,” said Berin Szóka, Senior Fellow at TechFreedom. “The appeals court read into the statute words that are not there. It makes sense that Congress required websites to prove ‘good faith’ when claiming the (c)(2)(A) immunity for their content moderation decisions. But that requirement makes no sense at all when a filtering tool developer is sued for providing its tool to others to make their own decisions and seeks protection under (c)(2)(B). Letting the Ninth Circuit’s decision stand invites litigation against the makers of anti-malware software, parental controls and other tools that empower users to filter content online. That liability will cause many small developers to exit the market even before they are sued.”

“This was an unfortunate act of ‘Ready, fire, aim,’” Szóka continued. “Justice Thomas often issues such statements when the Court decides not to take a case, to express his frustrations about the state of the law. Other justices sometimes do so, too, and there is nothing inherently wrong with such statements. But this is the very first time the Court has ever considered reviewing any case involving Section 230. The briefs in this case did not even address the issues Justice Thomas raises. He is free to call for fuller briefing on Section 230’s meaning in, as he says, ‘an appropriate case,’ but this is not that case. He had no need to express his own views, in extensive dicta, without the benefit of the briefing he acknowledges is needed.”

The Malwarebytes decision involved only the interplay between (c)(2)(A) and (c)(2)(B), not the interplay between (c)(2)(A) and (c)(1) or the meaning of (c)(1). Justice Thomas argues that “both provisions in §230(c) most naturally read to protect companies when they unknowingly decline to exercise editorial functions to edit or remove third-party content, §230(c)(1), and when they decide to exercise those editorial functions in good faith, §230(c)(2)(A).” Section 230(c)(1) ensures that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Section 230(c)(2)(A) protects providers from liability for “any action voluntarily taken in good faith” to restrict access to content they consider objectionable.

“Justice Thomas’s opinion tracks political talking points advanced by this White House about the meaning of Section 230,” continued Szóka. “Those theories are framed in textualist terms, but they quickly break down upon close examination — as Justice Thomas himself might ultimately agree if he waited to hear from both sides. Across the board, Section 230 protects the same thing the First Amendment does: editorial discretion. That’s why Congress said website operators cannot be held liable ‘as publishers’ for content they in no way created. Refusing to carry content one finds objectionable is a core function of any publisher. The courts have interpreted the statute correctly — as a way to short-circuit expensive litigation and thus avoid what one appeals court called ‘death by ten thousand duck-bites.’ Instead of resolving lawsuits over content moderation on a motion to dismiss, websites under Justice Thomas’s interpretation would effectively be forced to litigate those suits through discovery — which can account for up to 90% of the costs of litigation. That, in turn, will discourage content moderation.”

“The central purpose of Section 230 was to avoid the Moderator’s Dilemma: Congress wanted to ensure that websites weren’t discouraged from trying to clean up harmful or illegal content,” concluded Szóka. “If, as Justice Thomas argues, Section 230(c)(1) doesn’t protect websites from being held liable as distributors for content they knew, or should have known, was illegal, that liability will create a perverse incentive not to monitor user content — another version of the Moderator’s Dilemma. Conversely, holding websites liable for content they edit in any way, as Justice Thomas proposes, could discourage websites from attempting to make hard calls, such as blotting out objectionable words, including racial epithets, while leaving other content up. They may simply take down content entirely.”

###

See more of our work on free speech and Section 230 on our website, including: 

  • Our comments and reply comments on the NTIA’s petition asking the FCC to rewrite Section 230
  • Our coalition letter explaining the constitutional and practical problems raised by the EARN IT Act
  • Our press release on the previous draft of the EARN IT Act (March 5, 2020)
  • Our blog posts on Techdirt on the DOJ Section 230 Workshop: Part I, Part II, Part III
  • A coalition letter by 27 civil society organizations and 53 academics proposing a set of seven principles to guide conversation about amending Section 230 of the Communications Decency Act of 1996
  • Our Twitter thread breaking down the White House Executive Order on Section 230
  • President Berin Szóka’s testimony before the House Judiciary Committee on the filtering practices of social media platforms
  • Our statement on the passage of SESTA
  • Our statement on the takedown of Backpage and its implications for Section 230 and recent sex trafficking legislation
