In a case set to be heard this term, the Supreme Court will consider Section 230 of the Communications Decency Act (CDA) for the first time in the law’s quarter-century history. Gonzalez v. Google LLC was one of nine new cases added to the Supreme Court’s docket on October 3, 2022.
Section 230, the 1996 provision aimed at fostering the development of the Internet and avoiding a chilling effect on speech, grants Internet platforms broad immunity from liability for content published by others. As the capabilities and functionalities of Internet services move further and further afield from what could be imagined in the last decade of the twentieth century, the Supreme Court’s determination in Gonzalez v. Google could radically retool tech companies’ ironclad immunity under Section 230.
At the center of Gonzalez v. Google is the family of Nohemi Gonzalez, a 23-year-old student killed in a 2015 ISIS attack in Paris. The suit alleges that Google participated in terrorist recruitment and promotion by hosting videos created by ISIS on its platforms, algorithmically recommending terrorist content to users who would be interested in viewing it, placing paid advertisements in proximity to ISIS-created content, and splitting the resulting ad revenue with ISIS. The plaintiffs filed suit under the Anti-Terrorism Act (ATA), alleging that the defendants were liable for committing acts of international terrorism and secondarily liable for conspiring with, and aiding and abetting, ISIS’s acts of international terrorism.
While conceding that Google’s policies expressly prohibited the content at issue, the plaintiffs argued that the targeted promotion of terrorist content is not covered by Section 230 protections. The Ninth Circuit dismissed the claim, reasoning in part that such content recommendations are part of the editorial process that Section 230 protects.
“A website’s use of content-neutral algorithms, without more, does not expose it to liability for content posted by a third-party. Under our existing case law, § 230 requires this result,” wrote Judge Morgan Christen for the majority in an opinion addressing three separate appeals. The lower court also ruled that the plaintiffs had not alleged the elements necessary to establish direct liability for an act of international terrorism.
“In sum, though we agree the Internet has grown into a sophisticated and powerful global engine the drafters of § 230 could not have foreseen, the decision we reach is dictated by the fact that we are not writing on a blank slate,” the majority wrote. “Congress affirmatively immunized interactive computer service providers that publish the speech or content of others.”
Judge Ronald M. Gould dissented in part, arguing that Section 230 should not immunize Google from liability for claims related to its algorithms.
“I do not believe that Section 230 wholly immunizes a social media company’s role as a channel of communication for terrorists in their recruiting campaigns and as an intensifier of the violent and hatred-filled messages they convey,” Gould wrote. “The law should not give social media platforms total immunity, and in my view it does not.”
SCOTUSblog.com defines the issue now at hand as “[w]hether Section 230(c)(1) of the Communications Decency Act immunizes interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limits the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information.”
In a filing to the Supreme Court, Google argued that if the Court finds that YouTube recommendations can’t be shielded from legal liability, “section 230 would be a dead letter. This Court should not lightly adopt a reading of section 230 that would threaten the basic organizational decisions of the modern internet.”
Two years ago, Justice Clarence Thomas signaled interest in considering the issue now before the Court. In a statement respecting the Court’s denial of certiorari in Malwarebytes, Inc. v. Enigma Software Group USA, LLC, Thomas suggested that “in an appropriate case, [the Court] should consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by Internet platforms.”
One of the other appeals addressed jointly by the Ninth Circuit opinion will also be heard by the Supreme Court this term. Twitter, Inc. v. Taamneh will examine two questions: “(1) [w]hether a defendant that provides generic, widely available services to all its numerous users and ‘regularly’ works to detect and prevent terrorists from using those services ‘knowingly’ provided substantial assistance under 18 U.S.C. § 2333 merely because it allegedly could have taken more ‘meaningful’ or ‘aggressive’ action to prevent such use; and (2) whether a defendant whose generic, widely available services were not used in connection with the specific ‘act of international terrorism’ that injured the plaintiff may be liable for aiding and abetting under Section 2333.”
The question of whether Section 230 covers targeted recommendations or only traditional editorial functions stands to overhaul the legal landscape for any Internet service that hosts user-generated content. NRB will continue to monitor the proceedings in Gonzalez as the Supreme Court’s 2022-2023 term unfolds.