
Crackdown on Content Regulation: Militia Group Conspires Through Facebook Despite Several User Reports


As technology continues to make rapid advancements, content moderation by social media platforms has increasingly become a pressing issue of public debate. Social media platforms, such as Facebook, have revolutionized the way people communicate with each other by creating a virtual space where individuals can reconnect with old friends, stay up to date with family and loved ones, and find niche communities to communicate with those who have similar interests. Despite the ease with which these platforms allow their users to communicate and share ideas, content moderation is necessary to protect social media consumers from being inundated with undesirable or disturbing content.[1]

A shooting that took place at a protest in Kenosha, Wisconsin on August 25, 2020 has illuminated the inadequacy of Facebook’s content moderation.[2] The day before the killings, the Kenosha Guard, a 3,000-member Facebook group, created a Facebook event that encouraged individuals to take up arms and “defend our city” from the “evil thugs,” referring to the protesters.[3] The event drew hundreds of RSVPs, and anticipated attendees posted comments such as “I fully plan to kill looters and rioters tonight.”[4]

Hoping that Facebook would apply its newly implemented policy of banning militia groups and groups that incite violence, users sent more than four hundred complaints to the company regarding the event.[5] The reports elicited four manual and several automated reviews of the Kenosha Guard event page.[6] After these reviews, Facebook responded to user reports that the event “doesn’t go against one of our specific Community Standards” and declined to take down the event page.[7]

Among those who responded to the Kenosha Guard’s call was 17-year-old Kyle Rittenhouse, who drove across state lines with an assault rifle.[8] At the protest, Rittenhouse shot and killed two protesters and injured a third.[9] The Kenosha Guard’s page was subsequently flooded with comments such as “1 protester dead got shot in the head…and a couple more got shot. Gotta love it.”[10] The page was later taken down.[11] Facebook’s CEO, Mark Zuckerberg, publicly apologized and called the failure to remove the Kenosha Guard’s page prior to the protest an “operational mistake.”[12]

The partner of one of the victims, along with three other individuals, filed a lawsuit in October 2020 alleging that Facebook’s failure to delete the Kenosha Guard pages enabled Rittenhouse to kill two people.[13] The plaintiffs argue that Facebook was aware of the pages because of the more than four hundred complaints filed and that the deaths could have been prevented had Facebook taken action.[14] The lawsuit further claims that Facebook enabled militia groups to recruit and conspire.[15]

The NYU Stern Center for Business and Human Rights released a study acknowledging that all major social media platforms suffer from the same content moderation problem.[16] The report attributes Facebook’s inadequate content moderation to the company’s relentless focus on growth without a parallel strategy for regulating the spread of dangerous content.[17] To save money, Facebook relies on artificial intelligence, which has severe limitations, and outsources most of its content moderation to third-party vendors, who often fail to recognize the gravity of the content they are reviewing.[18] The report estimates that about three million items are flagged by users or artificial intelligence each day and that roughly 10% of moderation decisions are erroneous, leaving around 300,000 content moderation mistakes per day.[19]

Users of Facebook and other social media platforms have been pushing the companies to strengthen content moderation and make these virtual spaces safer and more comfortable for all to use.[20] As one of the users who reported the Kenosha Guard event stated, “[t]hey should be taking action against it before people get killed, not after.”[21]


Laura Rann

Laura Rann is a second-year J.D. candidate at Fordham University and a staff member of the Intellectual Property, Media & Entertainment Law Journal. She is also a member of the Dispute Resolution Society, symposium coordinator of the Media & Entertainment Law Society, a 1L Student Advisor, and a Lexis Student Representative. She holds a B.A. in Music from the University of Georgia.