
The Twentieth Anniversary of the CDA & the Changing Role of Social Media Companies


The Communications Decency Act (“CDA”), an old but increasingly important statute, marked its twentieth anniversary this year, having become law in 1996.[1] Before going into the details of the CDA and the issues it raises, let’s discuss a recently decided case that brought it back into the spotlight.


A federal district court in California dismissed a claim by the widow of an American killed in Jordan who accused Twitter of giving a voice to the Islamic State.[2] Why is Twitter not liable? Among other reasons, Twitter was protected from secondary liability under Section 230 of the CDA, a provision that shields online service providers like Twitter and Facebook from liability for content that users post on their platforms.[3]


The CDA provision was intended to allow online service providers to expand their services without fearing secondary liability for their users’ actions.[4] There is no doubt that the CDA has helped online providers prosper and expand, providing new opportunities for people to express themselves freely.[5]


Social networking platforms in particular have been heavily protected by the CDA.[6] These platforms are unique among online providers because users expose a large amount of personal information and opinion on them.[7] Thus, social networking platforms are prone to abuse by their users, in the form of terrorist-organization recruitment, cyberbullying, and hate speech.[8] Such abuse can eventually lead to human rights violations, with no one to stop them.[9]


I believe that the most effective way to approach this issue is by bringing into the spotlight the social media companies that hide behind the CDA. Social media companies cannot escape responsibility for online behavior on their platforms, especially because they are in the best position to minimize that abuse.


An important step is for social media companies to implement internal regulations. For example, social media companies must be transparent about how much online abuse occurs on their platforms.[10] Without data on how much abuse is occurring, society cannot scrutinize social media companies’ responses or hold them accountable to their users.[11] Furthermore, social media companies have a duty to create transparent policies on how they will censor content on their platforms.[12] Transparent policies will make clear to users what content will be censored, and will ultimately protect freedom of speech on those platforms.[13]

Why burden the corporation with such responsibility? The UN Guiding Principles on Business and Human Rights state that corporations have a responsibility to respect human rights.[14] This means that companies have a responsibility to avoid negative impacts on human rights and to address such impacts where they do occur.[15] Thus, social media companies can and should address the potential human rights issues that arise from harmful content published on their platforms by putting in place appropriate policies and processes in a proactive effort to respect human rights.[16]


Lisa Matsue

Lisa Matsue is a second-year student at Fordham University School of Law and a staff member of the Fordham Intellectual Property, Media & Entertainment Law Journal. Before law school, she worked as a news programmer at a television company in Tokyo, Japan.