Section 230: Emergent Reforms - Fordham Intellectual Property, Media & Entertainment Law Journal
Section 230: Emergent Reforms

Section 230 of the Communications Decency Act (“Section 230”) has been the topic of much discussion recently. In November, this journal’s blog covered the FCC’s plans to initiate a rulemaking process centered around the interpretation of the statute.1 Since then, previous efforts at legislative reform have continued to advance,2 and new efforts have been initiated.3

The recent debate has centered in large part around subsection 230(c) in particular. This subsection protects intermediaries that engage in “‘Good Samaritan’ blocking and screening of offensive material.”4 The law arose in response to the case of Stratton Oakmont v. Prodigy.5 There, Prodigy, an online service provider, made specific efforts to filter offensive content from its platform in order to create a family-friendly space.6 However, despite these attempts, an allegedly defamatory post was made on Prodigy’s platform.7 Prodigy was subsequently sued in New York state court by the target of the statement and was ultimately held liable as a publisher.8 To prevent similarly situated intermediaries from being held liable for harmful content that happened to slip through the cracks, Section 230(c) was enacted.9 However, today some find that Section 230(c)’s protections have been read too broadly in that they provide sweeping immunity to intermediaries who bear little resemblance to the “Good Samaritan” the subsection’s title calls to mind.10

In response to these concerns, several reforms to the statute have been proposed.11 The most recent of these is the Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms (SAFE TECH) Act.12 Announced on February 5, 2021, this bill aims to limit CDA immunity for intermediaries in several discrete cases.13 Specifically, it would “make clear” that the immunities afforded by Section 230 do not apply to “ads or other paid content,” “impair enforcement of civil rights laws,” “interfere with laws that address stalking/cyber-stalking or harassment of protected classes,” or bar injunctive relief, wrongful death actions, or suits under the Alien Tort Claims Act.14

The second most recent reform is the Protecting Americans from Dangerous Algorithms Act (PADAA), which was introduced in October 2020.15 This bill would specifically target an activity the drafters refer to as “Algorithmic Amplification.”16 That is, it targets the way platforms amplify content through their algorithms, allowing that content to reach wider audiences than it otherwise would.17 The bill claims to be “narrowly targeted at the algorithmic promotion of content that leads to the worst types of offline harms.”18 For instance, this bill would seek to hold Facebook liable for the role its algorithms allegedly played in the shooting that took place in Kenosha, Wisconsin in August of 2020.19

Third is the Platform Accountability and Consumer Transparency (PACT) Act, which was introduced in June 2020.20 This act functions in part by imposing reporting requirements on intermediaries.21 Under the act, for instance, platforms would be required to “have an acceptable use policy that reasonably informs users about the content that is allowed on platforms and provide notice to users that there is a process to dispute content moderation decisions.”22 In addition, it would “require large online platforms to remove court-determined illegal content and activity within 24 hours.”23 Further, it would explicitly “[e]xempt the enforcement of federal civil laws from Section 230” such that the intermediaries would be unable to use Section 230 as a shield when facing civil actions by federal regulators.24

These recent reform proposals are not without their critics.25 Some overarching concerns relate to the First Amendment,26 burdens placed on smaller tech companies that are less equipped to bear them,27 and whether the “free internet” will suffer without Section 230 as we know it today.28 However, the bipartisan support some of these efforts have seen underscores an equally strong belief that reform of this 1996 statute is needed nonetheless to protect users from powerful intermediaries in the current technological environment.

At this point, the fate of these reform measures, and of Section 230 in general, is unclear. Only time will tell how each of these legislative efforts will be received.


  1. A.J. Harris, Section 230: Platforms, Plain Meaning, and Projecting the Rulemaking Process to Come, Blog @ IPLJ (Nov. 23, 2020), http://www.fordhamiplj.org/2020/11/23/section-230-platforms-plain-meaning-and-projecting-the-rulemaking-process-to-come/ [https://perma.cc/9U5R-42Q4].

  2. See generally Emily Birnbaum, Tech Legislation to Watch in 2021, Protocol (Jan. 2, 2021), https://www.protocol.com/tech-legislation-2021 [https://perma.cc/M6L9-XN5N].

  3. See generally Reps. Eshoo and Malinowski Introduce Bill to Hold Tech Platforms Liable for Algorithmic Promotion of Extremism, Cong. Anna G. Eshoo (Oct. 20, 2020), https://eshoo.house.gov/media/press-releases/reps-eshoo-and-malinowski-introduce-bill-hold-tech-platforms-liable-algorithmic [https://perma.cc/XA3K-PN5S]; Warner, Hirono, Klobuchar Announce the SAFE TECH Act to Reform Section 230, Mark R. Warner: U.S. Sen. from the Commonwealth of Va. (Feb. 5, 2021), https://www.warner.senate.gov/public/index.cfm/2021/2/warner-hirono-klobuchar-announce-the-safe-tech-act-to-reform-section-230 [https://perma.cc/WM2J-JKDA].

  4. 47 U.S.C. § 230(c).

  5. See Danielle Keats Citron & Benjamin Wittes, The Internet Will Not Break: Denying Bad Samaritans Sec. 230 Immunity, 86 Fordham L. Rev. 401, 404–05 (2017) (explaining the role of Stratton Oakmont, Inc. v. Prodigy Servs. Co., 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995) in the history of Section 230).

  6. See Stratton Oakmont, Inc. v. Prodigy Servs. Co., 1995 WL 323710, at *2 (N.Y. Sup. Ct. May 24, 1995).

  7. Id.

  8. Id. at *1.

  9. See supra note 5.

  10. See Olivier Sylvain, Intermediary Design Duties, 50 Conn. L. Rev. 203, 213 (2018) (noting “[t]oday, the Good Samaritan, who is supposed to tend to the most vulnerable, plays no role in the courts’ administration of Section 230”); Mary Anne Franks, How the Internet Unmakes Law, 16 Ohio St. Tech. L.J. 10, 17 (2020) (observing “[s]ubsection (c)(2) . . . has been interpreted in ways that are . . . at odds with Good Samaritan laws”); Citron & Wittes, supra note 5, at 416 (proposing reform that would limit § 230(c) to actions taken by true Good Samaritans, in contrast to the law’s current, broader application).

  11. See supra notes 2-3.

  12. See Mark R. Warner, supra note 3.

  13. See id.

  14. Id.

  15. Cong. Anna G. Eshoo, supra note 3.

  16. See H.R. 8636, 116th Cong., at 2 (2020), https://www.congress.gov/116/bills/hr8636/BILLS-116hr8636ih.pdf.

  17. See Cong. Anna G. Eshoo, supra note 3.

  18. Id.

  19. See id.; Complaint ¶¶ 10–11, Gittings v. Matthewson, No. 2:20-cv-1483, 2020 WL 5726846 (E.D. Wis. filed Sept. 22, 2020) (alleging Facebook enabled “right-wing militias to recruit members and plan events,” and that it “continues to provide militias with the tools to further their violent conspiracies.” (emphasis in original)).

  20. See Schatz, Thune Introduce New Legislation to Update Section 230, Strengthen Rules, Transparency on Online Content Moderation, Hold Internet Companies Accountable for Moderation Practices, Brian Schatz: U.S. Sen. for Haw. (June 24, 2020), https://www.schatz.senate.gov/press-releases/schatz-thune-introduce-new-legislation-to-update-section-230-strengthen-rules-transparency-on-online-content-moderation-hold-internet-companies-accountable-for-moderation-practices [https://perma.cc/CXD6-RBP6].

  21. See id.

  22. Id.

  23. Id.

  24. Id.

  25. See infra notes 26-28.

  26. See, e.g., Daphne Keller, One Law, Six Hurdles: Congress’s First Attempt to Regulate Speech Amplification in PADAA, The Ctr. for Internet and Soc’y (Feb. 1, 2021, 5:00 AM), https://cyberlaw.stanford.edu/blog/2021/02/one-law-six-hurdles-congresss-first-attempt-regulate-speech-amplification-padaa [https://perma.cc/8CNY-VM7S] (addressing First Amendment concerns with PADAA); Sophia Cope & Aaron Mackey, The PACT Act Is Not The Solution To The Problem of Harmful Online Content, EFF (July 30, 2020), https://www.eff.org/deeplinks/2020/07/pact-act-not-solution-problem-harmful-online-content [https://perma.cc/7YAF-UBHY] (addressing First Amendment concerns with the PACT Act).

  27. See The SAFE TECH Act Frequently Asked Questions, Mark R. Warner: U.S. Sen. from the Commonwealth of Va., https://www.warner.senate.gov/public/_cache/files/b/1/b1369ebf-84a1-48c5-8849-2b1b1e23aab9/2236A5E3DD115F9D299CBCEDB7017F2E.the-safe-tech-faq.pdf.

  28. See generally Aaron Mackey, India McKinney & Jason Kelley, The SAFE Tech [sic] Act Wouldn’t Make the Internet Safer for Users, EFF (Feb. 25, 2021), https://www.eff.org/deeplinks/2021/02/safe-tech-act-wouldnt-make-internet-safer-users [https://perma.cc/SSP4-QMGL]; see also Mark R. Warner, supra note 27.

Haley Griffin

Haley Griffin is a second-year J.D. candidate at Fordham and a staff member of the Intellectual Property, Media & Entertainment Law Journal. She is also a competitor for the Dispute Resolution Society for Fall 2020, a member of the Brendan Moore Trial Advocacy team, and secretary of the Fordham Art Law Society. She holds a B.A. in English and Physiology from the University of Minnesota, Twin Cities.