
“Privacy. That’s iPhone.” Or Is It?

The tech giant is facing substantial criticism after announcing plans to combat Child Sexual Abuse Material (“CSAM”) through on-device photo scanning software, and then promptly reversing course and halting the planned updates until further notice.[1]

On August 5, 2021, Apple released a statement titled “Expanded Protections for Children,” previewing new child safety features to be incorporated in future iOS and iPadOS software updates.[2] These included updates to the Messages app, as well as to the Siri and Search features of Apple devices; the most controversial update, however, came in the form of on-device photo scanning software, referred to as CSAM detection software.[3]

Under the proposed CSAM detection software, Apple would reduce each photo on a user’s device to a unique set of numbers, referred to as a “hash.”[4] If the user chose to upload their photos to iCloud, Apple would then compare the hashes of the user’s photos with the hashes of images in a database of known CSAM.[5] Should a user surpass a threshold number of matches within the database, the case would be escalated to an Apple employee for individual review; if the photos did in fact contain CSAM content, the user’s account would be disabled and Apple would report the photos to the National Center for Missing and Exploited Children (“NCMEC”).[6]
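To picture the mechanics, the following is a minimal, purely illustrative Swift sketch of the hash-and-compare flow described above. It uses an ordinary SHA-256 digest as a stand-in; Apple’s announced design instead relied on a proprietary perceptual hash (reportedly called “NeuralHash”) and cryptographic matching techniques not reproduced here, and every name, value, and function in the sketch is an assumption for illustration, not Apple’s actual API.

```swift
import Foundation
import CryptoKit

// Purely illustrative: a SHA-256 digest stands in for Apple's proprietary
// perceptual hash, and the database contents, threshold, and function names
// below are placeholders, not Apple's actual implementation or API.

let knownHashDatabase: Set<String> = []   // would hold hashes of known CSAM images
let matchThreshold = 10                   // placeholder threshold value

/// Reduce a photo to a fixed-length digest ("hash"), rendered as a hex string.
func hashPhoto(_ photo: Data) -> String {
    SHA256.hash(data: photo)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Count how many of the user's photos match the known-hash database and
/// report whether the threshold for escalation to human review is surpassed.
func shouldEscalateForReview(_ photos: [Data]) -> Bool {
    let matches = photos.filter { knownHashDatabase.contains(hashPhoto($0)) }.count
    return matches > matchThreshold
}
```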

Scanning consumer photos for CSAM content is a routine practice among tech giants such as Facebook, Twitter, and Reddit.[7] However, the critical difference in Apple’s proposed updates is that, for the first time in Apple products, the CSAM detection software would be implemented on users’ devices instead of within the iCloud server.[8]

While Apple’s goal of protecting children from CSAM is an undeniably worthy one, some have criticized the proposed software changes as a shift toward an “infrastructure of surveillance.”[9] Because the photo scanning would occur on the device (as opposed to within iCloud), a user’s decision to uncheck iCloud Photos may not matter if Apple were faced with governmental pressure to scan for other types of content on that user’s phone or iPad.[10] Furthermore, as one commentator wrote, “the capability to reach into a user’s phone now exists, and there is nothing an iPhone user can do to get rid of it.”[11]

In response to Apple’s CSAM detection software announcement, more than 90 global organizations signed a letter addressed to Apple, calling on the company to scrap its plan for photo-scanning surveillance.[12] The letter warned: “Once this capability is built into Apple products, the company and its competitors will face enormous pressure – and potentially legal requirements – from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable.”[13]

Despite Apple pressing pause on the rollout, cybersecurity experts continue to emphasize the dangers of the proposed on-device technology. In a recently released report, more than a dozen prominent cybersecurity experts described such scanning tools as “ineffective and dangerous strategies that would embolden government surveillance.”[14]

Specifically, the report identifies many critical ways in which on-device scanning can fail, including misinterpreting safe images as harmful ones, failing to uncover harmful images at all, and institutionalizing “bulk surveillance systems launched on the public’s personal devices.”[15]

Susan Landau, a co-author of the report and a professor of cybersecurity policy at Tufts University, described this type of on-device technology as “scanning of a personal private device without probable cause for anything illegitimate being done.”[16] As a whole, these experts argue that the on-device technology proposed by Apple hardly presents a safer or more secure future for the world.[17]

Apple’s CSAM photo detection software highlights the uphill battle technology companies face in protecting digital privacy while attempting to halt illicit activity on their platforms. On-device scanning software not only presents the potential for abuse by authoritarian governments, but also raises specific legal questions about users’ rights, including whether Apple can make these programs mandatory, how much notice is required for user compliance, whether NCMEC should be considered a “state actor” or a private corporation, and how much transparency surrounds the NCMEC CSAM database.[18]

At present, there is no easy answer to the legal and policy questions posed by on-device scanning software. But one thing is clear: digital privacy concerns are increasingly putting tech giants like Apple between the rock of protecting user privacy and the hard place of eradicating illegal and immoral activity on their platforms.


Frances McDonald

Frances McDonald is a second-year J.D. candidate enrolled in Fordham University School of Law’s evening division and a staff member of the Intellectual Property, Media & Entertainment Law Journal. She holds a B.A. in Politics and International and Global Studies from Sewanee: The University of the South.