New York City’s Push for Accountable Algorithms

On January 11, 2018, New York City Mayor Bill de Blasio signed a bill into law that creates a task force to examine how the city’s agencies use algorithms to make decisions that can affect millions of New Yorkers.[1] The bill, Int. No. 1696, was introduced by Council Member James Vacca of the Bronx in August of 2017,[2] and is one of the first pieces of legislation in a major U.S. city to address the rapidly increasing prevalence of algorithmic decision making. While algorithms are vitally important for improving the efficiency of agency decision making and coping with rising populations, there is a concerning lack of transparency and accountability around their potential biases. For example, machine learning algorithms are often trained on datasets that include inherent socioeconomic biases.[3] An early example of the ramifications of flawed statistical decision making was the New York City Fire Department’s use of algorithmic analysis to decide which fire stations to close during the 1970s fiscal crisis. Biased data and flawed analysis by the RAND Corporation prompted the FDNY to close stations in minority communities, which inspired the famous (and, sadly, fictional) quote attributed to commentator Howard Cosell during the 1977 World Series: “Ladies and gentlemen, the Bronx is burning.”[4]
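To make the training-data concern concrete, here is a minimal, purely illustrative Python sketch (all data is invented for demonstration) of how a naive model trained on historically skewed approval records simply reproduces the disparity baked into its labels:

```python
from collections import defaultdict

# Invented historical decisions (group, approved) reflecting past bias:
# group A was approved 3 times out of 4, group B only 1 time out of 4.
history = [("A", 1), ("A", 1), ("A", 1), ("A", 0),
           ("B", 1), ("B", 0), ("B", 0), ("B", 0)]

# "Training": learn each group's approval rate from the biased labels.
counts = defaultdict(lambda: [0, 0])  # group -> [approvals, total]
for group, approved in history:
    counts[group][0] += approved
    counts[group][1] += 1

def predict(group):
    """Predicted probability of approval, learned from history."""
    approvals, total = counts[group]
    return approvals / total

print(predict("A"))  # 0.75 -- the model inherits group A's high rate
print(predict("B"))  # 0.25 -- and group B's low rate: bias in, bias out
```

Nothing in this toy example is specific to any city system; it simply shows that a model fit to biased labels will faithfully reproduce that bias in its predictions.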

New York’s many administrative agencies use algorithms to make decisions that affect almost every aspect of New York life. Algorithms are used to assign kids to schools, screen for benefit fraud, assess teacher performance, and drive predictive policing.[5] Nationally, some of the most controversial algorithms are used to assess recidivism risk in sentencing and parole decisions.[6] A previous iteration of Council Member Vacca’s bill would have required the city’s agencies to release the source code of any algorithm they used to make decisions affecting individual New Yorkers. While that version was undoubtedly well intentioned and designed to address the lack of transparency around algorithmic decision making, it had a number of flaws. For one, third-party algorithm vendors are extremely reluctant to release the source code of their proprietary algorithms, since IP protections for algorithms are unpredictable at best.[7] Companies could be unwilling to provide algorithms to New York City agencies if they knew that their source code would have to be released to the public. Another issue is that the source code of an algorithm is unintelligible to the average member of the public.[8]

While Council Member Vacca’s bill defers concrete action until the task force concludes its reporting, which could take up to 18 months,[9] it is promising to see a major city begin to examine the impacts of potential biases in algorithms. The task force will likely recommend measures such as releasing the training and validation datasets these algorithms rely on, so that they can be examined for bias. Beyond that, it would be wise to allow the public to feed test inputs into some of these algorithmic black boxes and observe the outputs, to confirm that those results comport with real-life agency decision making; a rough sketch of what such an audit might look like appears below. New York’s policy decisions on algorithms may set the tone for how other governing bodies – local, state, and even federal – approach this thorny and technical issue going forward.
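As a purely illustrative example, the following Python sketch shows one way such black-box auditing could work. Everything here is hypothetical: agency_score stands in for an opaque agency system whose internals the auditor cannot see, and the auditor submits paired inputs that differ in a single attribute to test whether that attribute alone drives the outcome.

```python
def agency_score(applicant):
    # Hypothetical stand-in for an opaque agency system; the auditor
    # cannot see this logic and can only observe inputs and outputs.
    score = 50 + applicant["years_in_city"]
    if applicant["zip_code"] == "10451":
        score -= 20
    return score

def paired_probe(score_fn, template, attribute, value_a, value_b):
    """Score two inputs that are identical except for one attribute."""
    a = dict(template, **{attribute: value_a})
    b = dict(template, **{attribute: value_b})
    return score_fn(a), score_fn(b)

# Probe whether ZIP code alone changes the outcome for an otherwise
# identical applicant.
template = {"years_in_city": 10, "zip_code": None}
in_bronx, downtown = paired_probe(agency_score, template,
                                  "zip_code", "10451", "10007")
print(in_bronx, downtown)  # 40 60 -- a gap like this flags the attribute
```

The appeal of this kind of paired testing is that it requires no access to proprietary source code, which sidesteps the vendor and intelligibility objections discussed above.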


Galen Stump

Galen Stump is a second-year student at Fordham University School of Law. Raised in New York, Galen was pulled back into the city’s clutches after graduating from the University of Vermont with majors in Political Science and History. He spent the summer with the Irish Supreme Court in Dublin.