
Section 230: Emergent Reforms

Section 230 of the Communications Decency Act (“Section 230”) has been the topic of much discussion recently. In November, this journal’s blog covered the FCC’s plans to initiate a rulemaking process centered around the interpretation of the statute.[1] Since then, previous efforts at legislative reform have continued along,[2] and new efforts have been initiated.[3]

The recent debate has centered in large part around subsection 230(c) in particular. This provision provides protection for intermediaries that engage in “‘Good Samaritan’ blocking and screening of offensive material.”[4] The law arose in response to the case of Stratton Oakmont v. Prodigy.[5] There, Prodigy, an online service provider, made specific efforts to filter offensive content from its platform in order to create a family-friendly space.[6] Despite these attempts, however, an allegedly defamatory post was made on Prodigy’s platform.[7] Prodigy was subsequently sued in New York state court by the target of the statement and was ultimately held liable as a publisher.[8] Section 230(c) was enacted to prevent similarly situated intermediaries from being held liable for harmful content that happened to slip through the cracks.[9] Today, however, some find that Section 230(c)’s protections have been read too broadly, in that they provide sweeping immunity to intermediaries who bear little resemblance to the “Good Samaritan” the subsection’s title calls to mind.[10]

In response to these concerns, several reforms to the statute have been proposed.[11] The most recent of these is the Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms (SAFE TECH) Act.[12] Announced on February 5, 2021, this bill aims to limit CDA immunity for intermediaries in several discrete cases.[13] Specifically, it would “make clear” that the immunities afforded by Section 230 do not apply to “ads or other paid content,” “impair enforcement of civil rights laws,” “interfere with laws that address stalking/cyber-stalking or harassment of protected classes,” or bar injunctive relief, wrongful death actions, or suits under the Alien Tort Claims Act.[14]

The next most recent reform is the Protecting Americans from Dangerous Algorithms (PADAA) bill, which was introduced in October of 2020.[15] This bill would specifically target an activity the drafters refer to as “Algorithmic Amplification”[16]—that is, the way platforms’ algorithms amplify content, allowing it to reach wider audiences than it otherwise would.[17] The bill claims to be “narrowly targeted at the algorithmic promotion of content that leads to the worst types of offline harms.”[18] For instance, this bill would seek to hold Facebook liable for the role its algorithms allegedly played in the shooting that took place in Kenosha, Wisconsin in August of 2020.[19]

Third is the Platform Accountability and Consumer Transparency (PACT) Act, which was introduced in June 2020.[20] This act functions in part by imposing reporting requirements on intermediaries.[21] Under the act, for instance, platforms would be required to “have an acceptable use policy that reasonably informs users about the content that is allowed on platforms and provide notice to users that there is a process to dispute content moderation decisions.”[22] In addition, it would “require large online platforms to remove court-determined illegal content and activity within 24 hours.”[23] Further, it would explicitly “[e]xempt the enforcement of federal civil laws from Section 230” such that the intermediaries would be unable to use Section 230 as a shield when facing civil actions by federal regulators.[24]

These recent reform proposals are not without their critics.[25] Some overarching concerns relate to the First Amendment,[26] burdens placed on smaller tech companies that are less equipped to bear them,[27] and whether the “free internet” will suffer without Section 230 as we know it today.[28] However, the bipartisan support some of these efforts have seen underscores an equally strong belief that reform of this 1996 statute is needed nonetheless to protect users from powerful intermediaries in the current technological environment.

At this point, the fate of each of these reform measures, or of Section 230 in general, is unclear. Only time will tell how each of these legislative measures will be received.


Haley Griffin

Haley Griffin is a second-year J.D. candidate at Fordham and a staff member of the Intellectual Property, Media & Entertainment Law Journal. She is also a competitor for the Dispute Resolution Society for Fall 2020, a member of the Brendan Moore Trial Advocacy team, and secretary of the Fordham Art Law Society. She holds a B.A. in English and Physiology from the University of Minnesota, Twin Cities.