Supreme Court Declines to Reconsider Foundational Principles of Internet Platform Liability

Alert

On May 18, 2023, in Twitter, Inc. v. Taamneh, the Supreme Court issued a unanimous opinion declining to impose secondary liability on tech companies for allegedly failing to prevent ISIS from using their platforms for recruiting, fundraising, and organizing.1 The Court explained that internet platforms cannot be held secondarily liable under Section 2333 of the Anti-Terrorism Act based solely on broad allegations that they could have taken more aggressive action to prevent terrorists from using their services.2 The holding has potentially significant implications for the current debate concerning the future of Section 230 of the Communications Decency Act, which generally provides immunity for internet platforms with respect to user-generated content.3

Background

On January 1, 2017, Jordanian citizen Nawras Alassaf was killed during a terrorist attack on the Reina nightclub in Istanbul, Turkey.4 The attacker was Abdulkadir Masharipov, who had traveled to Turkey on orders from ISIS to plan and coordinate an attack in Istanbul on New Year's Eve.5 The day after the attack, ISIS released a statement claiming responsibility.6

Members of Alassaf's family (the Taamnehs) filed a lawsuit against Facebook, Inc., Google, Inc., and Twitter, Inc. based on Section 2333 of the Anti-Terrorism Act, which enables U.S. nationals to bring civil suit against any person or entity that "aids and abets" international terrorism by "knowingly providing substantial assistance."7 In brief, the plaintiffs argued that the defendants knew that their platforms played an important role in ISIS's recruiting, fundraising, and organizing efforts, but failed to take appropriate action8 to keep ISIS content off those platforms.9 The plaintiffs asserted that the defendants' inaction – coupled with their provision of services used by ISIS – amounted to aiding and abetting terrorism, such that the defendants should be secondarily liable for the Reina nightclub attack.10

The United States District Court for the Northern District of California dismissed plaintiffs' complaint for failure to state a claim, finding (among other deficiencies) that the plaintiffs had failed to adequately allege that the defendants had provided "substantial assistance" to ISIS by playing a role in any particular terrorist activities.11 The U.S. Court of Appeals for the Ninth Circuit reversed the District Court's decision, finding that the Taamneh family's allegations of aiding and abetting under Section 2333 of the Anti-Terrorism Act were sufficient to survive a motion to dismiss.12 The Supreme Court granted certiorari to decide whether plaintiffs had adequately pled their claims.13

Taamneh Holding

The Supreme Court unanimously held that the plaintiffs' allegations that the defendants aided and abetted ISIS in the terrorist attack on the Reina nightclub failed to state a claim under Section 2333(d)(2) of the Anti-Terrorism Act.14 The crux of the decision is that the provision of a social media platform, standing alone, is insufficient to establish secondary liability for the conduct of bad actors using the platform – even if the platform provider has some general awareness of the bad actors' presence.15 As summarized by the Court:

The mere creation of those platforms, however, is not culpable. To be sure, it might be that bad actors like ISIS are able to use platforms like defendants' for illegal—and sometimes terrible—ends. But the same could be said of cell phones, email, or the internet generally. Yet, we generally do not think that internet or cell service providers incur culpability merely for providing their services to the public writ large. Nor do we think that such providers would normally be described as aiding and abetting, for example, illegal drug deals brokered over cell phones—even if the provider's conference-call or video-call features made the sale easier.16

Drawing additional comparisons to older forms of communication (e.g., cell phones, email) and citing traditional common law aiding and abetting principles, the Court found that the law requires plaintiffs to establish a concrete nexus between the internet platforms and the specific terrorist attack(s) in question.17 The Court reasoned that imposing secondary liability based on plaintiffs' broad allegations that the defendants were generally aware of the presence of ISIS on their platforms – and failed to do enough to prevent terrorists from using their services – would effectively mean that all similarly situated internet platform providers could potentially be held liable "as having aided and abetted each and every ISIS terrorist attack" anywhere in the world.18

The Court also addressed plaintiffs' allegation that defendants' "recommendation" algorithms – which enabled ISIS to more easily reach the users most likely to be interested in its content – went beyond passive aid and constituted active, substantial assistance.19 The Court disagreed, finding that the use of algorithms that "appear agnostic as to the nature of the content" matched with users did not convert the defendants' "passive assistance" into "active abetting."20 In sum, the Court rejected the idea that the use of content-neutral recommendation algorithms changes the liability analysis for platform providers.

Gonzalez v. Google LLC

In light of the decision in Taamneh, the Court remanded the related case Gonzalez v. Google LLC to the Ninth Circuit for its reconsideration of the plaintiffs' complaint.21 Like Taamneh, the Gonzalez case was brought by family members of an ISIS attack victim who alleged that Google's purported amplification of ISIS content through recommendation algorithms was sufficient to impose liability for aiding and abetting international terrorism.22 However, whereas the Taamneh case grappled with questions concerning statutory interpretation of the Anti-Terrorism Act and common law tort liability, the key question presented in Gonzalez was whether Section 230 immunity still applies if an online platform makes targeted recommendations of third-party content to its users.

In an unsigned three-page ruling referencing the Taamneh decision, the Court declined to address the scope of Section 230's protections.23 The Court found that there was no need to weigh in on the scope of Section 230 because "much (if not all) of plaintiffs' complaint seems to fail under either our decision in [Taamneh] or the Ninth Circuit's unchallenged holdings below."24 Given the Court's statement that the underlying allegations in Gonzalez are "materially identical" to those in Taamneh, the Ninth Circuit may dismiss the case for failing to state a claim under Section 2333(d)(2) of the Anti-Terrorism Act – leaving the Section 230 question unresolved for now.25 However, Section 230's protections for internet platforms will be subject to ongoing scrutiny, and future cases could lead to circuit splits. Eventually, the need for federal uniformity on these issues may force the Court's hand.

Implications

Although Taamneh leaves the door open for imposing liability in scenarios where there is a concrete nexus between the conduct of an internet platform provider and a specific criminal act carried out by a user, it presents a strong case against a wholesale reassessment of the general liability-shielding principles established in Section 230. Indeed, the Taamneh decision, considered in tandem with Gonzalez, suggests that the Court will be hesitant to expand the nature of internet platform liability for user-generated content.

Despite recent bipartisan criticism of tech companies' purported lack of accountability for third-party content published on their services, lawmakers have not found a workable solution. Some analysts26 hypothesized that, in the absence of congressional action, the Court might use these cases to sharply narrow Section 230's protections. However, it now seems that the Court may approach any future cases concerning platform liability carefully, and may avoid making sweeping changes to the law given the complexities of platform content moderation and the hugely disruptive ramifications that would accompany such a decision.

The Taamneh decision also rejects one of the most common arguments for imposing heightened liability on social media (as compared to older forms of communication such as calls, texts and email): the claim that a provider's use of recommendation systems takes it out of the realm of passive hosting and thrusts it into a position of explicit support for third-party content posted by users. The Court's finding – that employing an "agnostic" recommendation system does not make a social media provider inherently different from an internet or cell service provider – suggests that social media providers will not face liability merely for providing their services to the public at large and failing to aggressively moderate content. Thus, unless Congress takes action to reconsider the foundational principles governing the internet or the Court takes a different approach in a future case concerning the scope of Section 230, the current liability shield for internet service providers is here to stay.

1 Twitter, Inc. v. Taamneh, 598 U. S. ____ (2023).
2 Id. at 24-25.
3 See 47 U.S.C. § 230.
4 Twitter, Inc. v. Taamneh, 598 U. S. ____, 2 (2023).
5 Id.
6 Id.
7 18 U.S.C. § 2333(d)(2).
8 The plaintiffs’ main concerns were around the defendants’ detection and content removal processes. For instance, the plaintiffs argued that the defendants “failed to implement . . . a basic account detection methodology” to prevent ISIS supporters from using their platforms. Twitter, Inc. v. Taamneh, 598 U. S. ____, 5 (2023).
9 Taamneh v. Twitter, Inc., 343 F. Supp. 3d 904, 906-07 (N.D. Cal. 2018).
10 Id. at 908. 
11 Id. at 918.
12 In finding for the plaintiffs, the Ninth Circuit specifically highlighted the complaint’s allegations that (i) “defendants provided services that were central to ISIS’s growth and expansion, and that this assistance was provided over many years”; and (ii) “defendants allowed ISIS accounts and content to remain public even after receiving complaints about ISIS’s use of their platforms.” See Gonzalez v. Google LLC, 2 F. 4th 871, 910 (9th Cir. 2021). The Ninth Circuit found this conduct sufficient to support a claim for aiding-and-abetting liability, notwithstanding its acknowledgment that “the defendants regularly removed ISIS-affiliated accounts and content” and its recognition that caution should be exercised “in imputing aiding-and-abetting liability in the context of an arms-length transactional relationship of the sort defendants have with users of their platforms.” Id.
13 Twitter, Inc. v. Taamneh, 143 S. Ct. 81 (2022).
14 Twitter, Inc. v. Taamneh, 598 U. S. ____, 2 (2023).
15 Id. at 23.
16 Id.
17 The Court notes that “remote” support can give rise to aiding and abetting liability in the right case – however, “the more attenuated the nexus, the more courts should demand that plaintiffs show culpable participation through intentional aid that substantially furthered the tort.” Id. at 20, 30. Accordingly, given the absence of a concrete nexus between the purported support and the Reina nightclub attack, the Court found “the lack of any defendant intending to assist ISIS, and the lack of any sort of affirmative and culpable misconduct that would aid ISIS” fatal to plaintiffs’ claims. Id. at 29.
18 Id. at 25.
19 Id. at 23.
20 In the words of the Court, “As presented here, the algorithms appear agnostic as to the nature of the content, matching any content (including ISIS’ content) with any user who is more likely to view that content . . . Once the platform and sorting-tool algorithms were up and running, defendants at most allegedly stood back and watched; they are not alleged to have taken any further action with respect to ISIS.” Id.
21 Gonzalez v. Google LLC, 598 U. S. ____ (2023).
22 Id. at 1-2.
23 Id. at 2-3.
24 Id. at 3.
25 Id. at 2.
26 See, e.g., Birnbaum, E. (2023, February 21). Google, Section 230 are at center of Supreme Court debate over Big Tech. Bloomberg.com. https://www.bloomberg.com/news/articles/2023-02-21/google-section-230-are-at-center-of-supreme-court-debate-over-big-tech#xj4y7vzkg.

White & Case means the international legal practice comprising White & Case LLP, a New York State registered limited liability partnership, White & Case LLP, a limited liability partnership incorporated under English law and all other affiliated partnerships, companies and entities.

This article is prepared for the general information of interested persons. It is not, and does not attempt to be, comprehensive in nature. Due to the general nature of its content, it should not be regarded as legal advice.

© 2023 White & Case LLP
