New York City’s Push for Accountable Algorithms - Fordham Intellectual Property, Media & Entertainment Law Journal
New York City’s Push for Accountable Algorithms

On January 11, 2018, New York City Mayor Bill de Blasio signed a bill into law that creates a task force to examine how the city’s agencies use algorithms to make decisions that can affect millions of New Yorkers.1 The bill, Int. No. 1696, was introduced by Council Member James Vacca of the Bronx in August 2017,2 and is among the first pieces of legislation in a major U.S. city to address the rapidly growing prevalence of algorithmic decision making. While algorithms are vital to improving the efficiency of agency decision making and coping with rising populations, there is a concerning lack of transparency and accountability for their potential biases. Machine learning algorithms, for example, are often trained on datasets that carry inherent socioeconomic biases.3 An early example of the ramifications of flawed statistical decision making was the New York City Fire Department’s use of algorithmic analysis to decide which fire stations to close during the 1970s fiscal crisis. Biased data and flawed analysis by the Rand Corporation led the FDNY to close stations in minority communities, a chapter memorialized by the famous (though apocryphal) quote attributed to commentator Howard Cosell during the 1977 World Series: “Ladies and gentlemen, the Bronx is burning.”4

New York’s many administrative agencies use algorithms for decision making that affects almost every aspect of New York life. Algorithms are used to assign children to schools, screen for benefit fraud, assess teacher performance, and shape predictive policing.5 Nationally, some of the most controversial algorithms are used to assess recidivism risk in sentencing and parole decisions.6 A previous iteration of Council Member Vacca’s bill would have required the city’s agencies to release the source code of any algorithm they used to make decisions affecting individual New Yorkers. While that bill was undoubtedly well intentioned and designed to address the lack of transparency around algorithmic decision making, it had several flaws. For one, third-party algorithm vendors are extremely reluctant to release the source code of their proprietary algorithms, since intellectual property protections for algorithms are unpredictable at best.7 Companies could be unwilling to provide algorithms to New York City agencies if they knew their source code would have to be released to the public. Another issue is that the source code of an algorithm is unintelligible to the average member of the public.8

While Council Member Vacca’s bill defers concrete action until the task force concludes its reporting, which could take up to eighteen months,9 it is promising to see a major city begin to examine the impact of potential biases in algorithms. The task force will likely recommend measures such as releasing the training and validation datasets used by agency algorithms so that they can be examined for bias. Beyond that, it would be wise to allow the public to input data into some of these algorithmic black boxes and observe the outputs, to verify that those outputs comport with real-life agency decision making. New York’s policy decisions on algorithms may set the tone for how other governing bodies – local, state, and even federal – approach this thorny and technical issue going forward.


  1. NY Iɴᴛ. Nᴏ. 1696-2017, http://legistar.council.nyc.gov/LegislationDetail.aspx?ID=3137815&GUID=437A6A6D-62E1-47E2-9C42-461253F9C6D0 [https://perma.cc/3568-W4UQ].

  2. Id.

  3. Julia Angwin et al., Machine Bias, PʀᴏPᴜʙʟɪᴄᴀ (May 23, 2016), https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing [https://perma.cc/WS6V-DKAS].

  4. Jody Avirgan, Why the Bronx Really Burned, FɪᴠᴇTʜɪʀᴛʏEɪɢʜᴛ (Oct. 29, 2015, 4:16 PM), http://fivethirtyeight.com/features/why-the-bronx-really-burned [https://perma.cc/8PAZ-2VHV].

  5. Julia Powles, New York City’s Bold, Flawed Attempt to Make Algorithms Accountable, New Yorker (Dec. 20, 2017), https://www.newyorker.com/tech/elements/new-york-citys-bold-flawed-attempt-to-make-algorithms-accountable [https://perma.cc/5R4P-5CZG].

  6. Angwin et al., supra note 3.

  7. Donald S. Chisum, Cʜɪꜱᴜᴍ ᴏɴ Pᴀᴛᴇɴᴛꜱ § 1.03 (2017).

  8. Joshua A. Kroll et al., Accountable Algorithms, 165 U. Pa. L. Rev. 633, 638 (2017). Even experts can struggle to explain how a black box algorithm with machine learning functions reached the outputs that it did. Id.

  9. NY Iɴᴛ. Nᴏ. 1696-2017, http://legistar.council.nyc.gov/LegislationDetail.aspx?ID=3137815&GUID=437A6A6D-62E1-47E2-9C42-461253F9C6D0 [https://perma.cc/3568-W4UQ].


Galen Stump

Galen Stump is a second-year student at Fordham University School of Law. Raised in New York, Galen was pulled back into the city’s clutches after graduating from the University of Vermont with majors in Political Science and History. He spent the summer with the Irish Supreme Court in Dublin.