Biometric technology has become an increasingly common part of daily life. Although biometrics have been used for decades, recent advances and new uses have made the technology more prevalent, particularly in the private sector. This Note examines how widespread use of biometrics by the private sector is commodifying human characteristics. As biometric use becomes more extensive, it exposes individuals and industry to a growing number of risks and exacerbates the problems already associated with the technology. Contrary to public belief, biometric systems may be bypassed, hacked, or simply fail. The more widely a characteristic is utilized, the less value it holds for security purposes. And once compromised, a biometric cannot be replaced the way a password or other security device can.
This Note argues that there are strong justifications for a legal structure that builds hurdles to slow the adoption of biometrics in the private sector. By examining the law and economics and personality theories of commodification, this Note identifies market failure and potential harm to personhood due to biometrics. Together, these competing theories justify reform to protect human characteristics from commodification. This Note presents a set of principles and tools based on defaults, disclosures, incentives, and taxation to discourage the use of biometrics, buying time to strengthen the technology, educate the public, and establish legal safeguards for when the technology is compromised or fails.
On any given afternoon, a person shops at a grocery store, withdraws money from an ATM, and checks her smartphone a dozen times. Except she performs these tasks with a biometric: the grocery store implemented a system to pay with a fingerprint, the bank’s ATM requires a fingerprint instead of a PIN, and a fingerprint unlocks the screen of her smartphone. These uses of fingerprints are enormously convenient, and perhaps the individual feels more secure because her accounts are protected by something that is attached to her body. But how many other things did she touch that day? Probably door handles, coffee cups, light switches, tables, books, and countless other things. Does that mean she left her “password” or “key” on all these items? Suppose her bank notifies her that it suffered a data breach. How does she change her fingerprint?
Fingerprints are merely a type of biometric. The term biometrics is often used interchangeably to describe a characteristic or a method.1 As a characteristic, biometrics means measurable physiological or behavioral characteristics of a person that may be used for recognition.2 Measurable physiological characteristics include fingerprints, face, iris, retina, and hand geometry; examples of measurable behavioral characteristics are voice, keystroke, signature, and gait.3 As a method, biometrics means the process of automated recognition based on a person’s measurable characteristic.4 Biometric systems essentially make the human body “machine-readable.”5
Scholarly analysis of biometrics generally relates to government uses, such as national security and surveillance.6 However, this Note examines the rapidly expanding use of biometrics by the private sector. Does extensive use across industries accelerate the transformation of nonsalable attributes into market goods? For what purposes is it justified to use something so closely associated with oneself? This Note claims that widespread use of biometrics by the private sector is commodifying human characteristics and exacerbating other risks and problems associated with biometrics.
Biometrics are not new. For decades, law enforcement has used fingerprint analysis during criminal investigations.7 However, in the last few decades, technology has helped to automate the process and allow more human characteristics to be utilized for recognition.8 These technological advancements, coupled with growing concerns for terrorism and cybersecurity, are propelling the growth of biometric technology.9 Biometrics offer a number of advantages over other security systems. The characteristics are well-suited as identifiers because they are unique to each individual.10 Also, biometric identifiers are convenient; because humans carry the characteristic on their body at all times and it cannot be forgotten, biometrics eliminate the need to remember PINs and passwords or to carry identification documents.11
However, this Note demonstrates that the private sector’s use of biometrics raises significant privacy and security concerns. Privacy is about power over information, determining who should access and use information.12 Companies are beginning to collect biometrics in exchange for something else or without an individual’s knowledge. Because biometrics are easily obtained, individuals are left powerless over the collection and use of their characteristics. Similarly, if an individual is left with a binary choice of whether to provide biometrics or forgo a product, the collector has all the power in the transaction.
Security, on the other hand, determines who can actually access and use information; it implements the privacy choices.13 Biometrics are being used as a security measure; attributes are protecting other personal information. Most individuals believe that biometric systems are accurate and secure. However, this Note demonstrates the alarming number of flaws in biometric systems, such as the countless ways in which biometrics can be hacked and compromised. Further, a significant risk with biometrics is that they are irreplaceable. Reliance is rapidly being placed on human attributes that cannot be changed. In a world where data breaches are common occurrences, individuals should be prepared to change passwords and other security measures frequently. The numerous risks associated with biometrics are accentuated as the technology becomes more prevalent.
This Note argues that widespread use, propelled by the private sector, causes more parties to be interested in biometrics. As more biometric systems are implemented, unique human characteristics become more commonplace, heightening concerns for irreplaceability and security. This Note demonstrates that competing theories of commodification justify reform to protect biometrics. The law and economics approach, which places all things in the free market, allows intervention when faced with an inefficient market. Extensive evidence demonstrates that the nature of privacy, biometrics, and human cognition result in market failure. A similar conclusion is reached when biometrics are analyzed under Margaret Radin’s personality theory, where personal attributes are too personal to be monetized. Rather, the noncommodified version of biometrics fosters personhood and improves social interactions.
This Note concludes that there are strong justifications for a legal structure that builds hurdles to slow the adoption of biometrics in the private sector. Based on choice architecture, this Note presents a system of defaults, disclosures, and incentives to push the private sector, and individuals, away from utilizing biometrics. There is no way to prevent the use of biometrics altogether, but forcing companies and individuals to slow down will give society time to consider the risks, fortify security, and build safeguards for when the technology is compromised or fails. The proposed principles and set of tools are consistent with the self-regulation and limited government regulation traditions of the United States.
Part I explains how biometric technology operates and how the private sector is using biometrics. The discussion assesses the security vulnerabilities and serious risks of using biometrics. Part II explores competing theoretical views of commodification and concludes that due to the nature of biometrics and problems with privacy, intervention and reform are needed to govern biometrics. Part III describes available legal tools that may be applied to biometrics. The discussion suggests that current legal structures are inadequate to govern biometrics in the United States, but that a hybrid solution may be more effective. Part IV proposes a set of principles to guide collection, use, and storage of biometrics by the private sector. The proposal attempts to establish hurdles to slow the adoption and discourage private entities and individuals from utilizing biometrics.
* Editor-in-Chief, Fordham Intellectual Property, Media & Entertainment Law Journal, Volume XXVI; J.D. Candidate, 2016, Fordham University School of Law; B.S., 2005, Boston College. The Author would like to thank Professor Shlomit Yanisky-Ravid for her invaluable wisdom and guidance throughout the development of this Note. The Author specially thanks her parents, Richard and Joy, for their unconditional love and support. The Author thanks her brother, James, for his encouragement to never stop learning.
See Clifford S. Fishman & Anne T. McKenna, Wiretapping And Eavesdropping § 31:1 (2013). This Note primarily uses “biometrics” to refer to human measurable characteristics and uses “biometric system” when discussing the recognition process. ↩
See NSTC Subcomm. On Biometrics, Biometrics “Foundation Documents” 1 (2006), available at http://www.biometrics.gov/Documents/biofoundationdocs.pdf [http://perma.cc/6ASU-Z9AJ] [hereinafter Foundation Documents]. ↩
See Ishwar K. Sethi, Biometrics: Overview and Applications, in Privacy And Technologies Of Identity: A Cross-Disciplinary Conversation 117, 117 (Katherine J. Strandburg & Daniela Stan Raicu eds., 2006). There is some debate as to whether DNA is a biometric because DNA recognition is not currently automated. See Foundation Documents, supra note 2, at 21. ↩
See Foundation Documents, supra note 2, at 1. ↩
See Article 29 Data Protection Working Party, Opinion 3/2012 on Developments in Biometric Technology, 00720/12/EN, WP 193, at 4 (Apr. 27, 2012), available at http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2012/wp193_en.pdf [http://perma.cc/M486-U47Z] [hereinafter WP 193]. ↩
See, e.g., Lauren D. Adkins, Biometrics: Weighing Convenience and National Security Against Your Privacy, 13 Mich. Telecomm. & Tech. L. Rev. 541 (2007); Laura K. Donohue, Technological Leap, Statutory Gap, and Constitutional Abyss: Remote Biometric Identification Comes of Age, 97 Minn. L. Rev. 407 (2012); Margaret Hu, Biometric ID Cybersurveillance, 88 Ind. L.J. 1475 (2013); Rudy Ng, Catching Up To Our Biometric Future: Fourth Amendment Privacy Rights and Biometric Identification Technology, 28 Hastings Comm. & Ent. L.J. 425 (2006). ↩
See Donohue, supra note 6, at 418–19; Nancy Yue Liu, Bio-Privacy: Privacy Regulations And The Challenge Of Biometrics 4 (2012). ↩
See Foundation Documents, supra note 2, at 7; Liu, supra note 7, at 10–11. ↩
See Liu, supra note 7, at 3. ↩
See Robyn Moo-Young, “Eyeing” the Future: Surviving the Criticisms of Biometric Authentication, 5 N.C. Banking Inst. 421, 422 (2001). It should be noted that biometrics are not truly universal as some individuals may not have a specific characteristic due to disease, birth defects, or other causes, which could lead to discrimination as biometric systems are more widely implemented. See Liu, supra note 7, at 68. ↩
See Robin Feldman, Considerations on the Emerging Implementation of Biometric Technology, 25 Hastings Comm. & Ent. L.J. 653, 662 (2003); Daniel J. Solove, Nothing To Hide: The False Tradeoff Between Privacy And Security 201 (2011). ↩
See Derek E. Bambauer, Privacy Versus Security, 103 J. Crim. L. & Criminology 667, 673 (2013). ↩
See id. at 676–78. ↩