
Tawainna Anderson v. TikTok: Section 230 Shields TikTok Against Mom’s Lawsuit

Section 230 shields participants in the Internet ecosystem (whether service providers or individual users) from liability for illegal content posted online by others.[1] Section 230 has been widely recognized as one of the most important laws contributing to the success of the Internet.[2] Critics, however, argue that its broad protections allow social media companies to ignore real harm to users.[3] This blog post discusses Tawainna Anderson v. TikTok, Inc. by examining the facts of the case, the complaint, Section 230 of the Communications Decency Act, and the District Court’s reasoning in granting Defendant’s motion to dismiss.

Facts

TikTok is a social media platform that enables users to create and share short videos.[4] TikTok’s most popular feature is its For You Page (“FYP”).[5] The FYP displays a stream of videos created by other users and curated by TikTok’s algorithm.[6] The FYP works by learning a user’s preferences and displaying content tailored to that user.[7] Plaintiff’s daughter’s FYP presented a video of the Blackout Challenge (“Challenge”), in which individuals strangle themselves with household items.[8] Plaintiff later found her daughter unconscious in a closet, hanging from a purse strap. The daughter died after several days in intensive care.[9]

Complaint

Plaintiff brought products liability and negligence claims against TikTok.[10] Plaintiff alleged TikTok and its algorithm “recommend[ed] inappropriate, dangerous, and deadly videos to users” to make them addicted to the platform and that TikTok failed to warn “of the risks associated with dangerous and deadly videos and challenges.”[11] The court granted Defendant’s motion to dismiss, reasoning that Plaintiff’s claims were linked to Defendant’s actions as a publisher, thus implicating Section 230.[12]

Communications Decency Act Section 230

The Act immunizes interactive computer services that publish information provided by a third party.[13] Section 230 provides immunity when: (1) the defendant is an interactive computer service provider, (2) the plaintiff seeks to treat the defendant as a publisher, and (3) the information was provided by another information content provider.[14] In the instant case, the only dispute was whether Plaintiff, through her defective design and failure to warn claims, was attempting to treat TikTok as the publisher of the Challenge videos.[15] Importantly, in deciding whether Section 230 applies, what matters is not the cause of action itself but whether that cause of action requires the court to treat the defendant as the publisher of the third party’s content.[16] Essentially, the court must look to whether the defendant’s alleged duty stems from its “status or conduct as a publisher or speaker.”[17]

Reasoning

Plaintiff contended that her claims concerned Defendant’s “deliberate action[s].”[18] However, courts have held that algorithms themselves are not content.[19] The Second Circuit has ruled that “tools such as algorithms… are designed to match [] information with a consumer’s interests,” which is a publisher function covered under Section 230.[20]

Duty to Warn

In attempting to impose a duty to warn on TikTok, Plaintiff relied on two Ninth Circuit decisions: Doe v. Internet Brands, Inc., 824 F.3d 846 (9th Cir. 2016), and Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021). In Internet Brands, the plaintiff was a member of the website “Model Mayhem.” Two men posing as talent scouts lured the plaintiff to a fake audition, where they then assaulted her. The plaintiff alleged that the website’s failure to post warnings of the risks associated with using it violated a duty to warn. Importantly, the plaintiff’s claims in Internet Brands were unrelated to the site’s content. The court in Internet Brands found Section 230 immunity inapplicable because the duty to warn “[did] not arise from an alleged failure to adequately regulate access to user content” or “affect how [the defendant] publishes or monitors such content.”[21] In Lemmon, the plaintiffs alleged that Snapchat’s speed filter, which allowed users to record their actual speed, “helped cause the high-speed car accident in which plaintiffs’ two sons were killed.”[22] The court in Lemmon found Section 230 inapplicable because the claims were “independent[] of the content” created by Snapchat users.[23] Instead, the alleged defect related to the design of the platform.[24]

In the instant case, the duty to warn relates to TikTok’s publication of third-party information without a warning disclaimer.[25] Additionally, claims that rely on a service provider’s failure to implement safety measures to protect minors are “merely another way of claiming that [service providers are] liable for publishing the communications.”[26] Thus, the court found Section 230 immunity applicable and granted Defendant’s motion to dismiss.


Kevin Johnson

Kevin Johnson is a second-year J.D. candidate at Fordham University School of Law. He is a staff member at Fordham’s Intellectual Property, Media & Entertainment Law Journal. He holds a B.S. in Biology from the University of California, San Diego. Kevin is currently a member of the Dispute Resolution Society and the Brendan Moore Trial Advocacy Center.