
Balancing Technology and Rights: New York’s Proposed Bill on Electronic Monitoring and Automated Employment Decision Tools

As the integration of Artificial Intelligence (AI) in employment processes becomes more prevalent, concerns around privacy and discrimination are at the forefront. To address these concerns, New York Senate Bill 7623, introduced on August 4, 2023, seeks to crack down on the use of Automated Employment Decision Tools (AEDTs) and Electronic Monitoring Tools (EMTs) by expanding current state law and introducing new regulations.[1]

Electronic Monitoring Tools

The bill defines an electronic monitoring tool (EMT) as “any system that facilitates the collection of data concerning worker activities or communications by any means other than direct observation, including the use of a computer, telephone, wire, radio, camera, electromagnetic, photoelectronic, or photo-optical system.”[2] Since the pandemic and the rise of remote work, business managers have relied heavily on electronic monitoring tools to keep employees on track and increase productivity.[3] While beneficial to the company, these types of tools have raised major privacy concerns.[4]

The bill would make it unlawful for employers or employment agencies to use an electronic monitoring tool “to surveil employees” residing in the state unless it is “strictly necessary” and “the least invasive means to the employee that could reasonably be used” to accomplish the allowable purpose.[5] Allowable purposes are identified as the following:

  1. Allowing a worker to accomplish essential job functions;
  2. Monitoring production processes or quality;
  3. Assessing worker performance;
  4. Ensuring compliance with employment, labor, or other relevant laws;
  5. Protecting the health, safety, or security of workers;
  6. Administering wages and benefits; or
  7. Additional purposes to enable business operations as determined by the department.[6]

The suggested legislation also requires employers to give “clear and conspicuous” notice to employees regarding their intention to use EMTs.[7] The provisions for these notifications would notably expand current notification obligations set by S 2628, implemented on May 7, 2022.[8] Notifications under the new bill must convey specific details, such as: the permissible purpose; the precise data to be gathered; the instances, duration, and regularity of the data collection; and whether the acquired data will be integrated into an AEDT.[9]

Additional regulations include prohibiting the sale or transfer of data collected by an EMT, various restrictions on the collection, storage, and destruction of data collected by electronic monitoring tools, and a ban on using EMTs, including location-tracking applications, “to monitor employees who are off-duty and not performing work-related tasks.”[10] Lastly, the bill would further prohibit employers from requiring employees to install monitoring applications on personal devices unless strictly necessary to accomplish essential job functions.[11]

Automated Employment Decision Tools

The proposed bill builds on the New York City Automated Employment Decision Tools (“AEDT”) Law, which took effect on July 5, 2023, and prohibits employers from using AEDTs in hiring decisions unless the tools were first subjected to a third-party audit within a year of use.[12] The law, the first of its kind, was enacted to address concerns that algorithms and automated decision-making tools could produce discrimination and disparate effects on protected groups.[13] Examples include an increased likelihood that algorithms will favor “white-sounding” names over others.[14] A relevant real-world instance is a recent case brought before the Equal Employment Opportunity Commission, which alleged national origin discrimination involving a tech professionals’ job-search website.[15] The site had a job listing that incorporated key terms such as “H1B,” “visa,” “only,” and “must.”[16] Although such language was meant to confirm an applicant’s legal work status in the United States, it inadvertently excluded an entire protected category: American citizens, who do not hold H1B or other work visas.[17]

The bill’s definition of AEDT closely tracks the definition in the New York City law and would define an AEDT as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output,” such as a “score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making” to make employment-related decisions.[18] These decisions would encompass any that affect the pay, working hours, or job schedules of employees, or those utilized to assess performance when reviewing job applications, evaluating candidates for employment, or considering employees for termination or promotion.[19]

Similar to the New York City law, Senate Bill 7623 requires that AEDTs be subjected to an independent audit within one year of use and that a summary of the results be made publicly available on the website of the employer or employment agency.[20] Even then, employers would be prohibited from relying solely on an output from an AEDT.[21]

Additionally, no fewer than ten days before use, job candidates or employees residing in the state who would be subject to an AEDT must be notified that the tool is being used, and employers must provide the job qualifications and characteristics that the AEDT will use to make an assessment as well as “any outputs” the AEDT will produce.[22] The candidate must be allowed to request an “alternative selection process or accommodation” without consequence.[23]

Further, the bill would prohibit the use of an AEDT in a “manner as to unduly or extremely intensify the conditions of work or to harm the health and safety of employees,” to make predictions about employees’ behavior or personality not related to the job’s essential functions, collect biometric information (i.e., “facial recognition, gait, or emotion recognition technologies”), or “implement a dynamic wage-setting system that pays employees different wages for the same work.”[24]

So what could all this mean for you? These new regulations would have significant implications for how employers and employment agencies hire and retain employees. All companies with operations and employees in New York City, even if they are not based there, would be subject to this new regulation.[25] Because New York is a massive business hub, companies around the world could be affected by these regulations. Depending on who you are, this could mean more freedom to catch up on your favorite Reddit thread on the company laptop or hours of sifting through resumes to double-check AI employment screening decisions.


Madeline Hunter

Madeline Hunter is a second-year J.D. candidate at Fordham University School of Law and works as a Program Associate for the Computing Research Association. She is a staff member of the Intellectual Property, Media & Entertainment Law Journal and holds a B.A. in Political Science and Professional Writing from Miami University.