On January 1, 2023, New York City became the first jurisdiction in the United States to regulate employers’ use of automated employment decision tools (AEDTs) in the hiring and promotion process. Local Law 144 (the NYC AEDT Law), which requires anyone who uses (or wants to use) an AEDT to first conduct a bias audit and notify job candidates, is set to be enforced starting July 5, 2023. The NYC Department of Consumer and Worker Protection (DCWP) adopted its final rules on April 6, 2023, following two previous draft proposals, the first in September 2022 and the second in December 2022; a significant volume of comments (including from employers, employment agencies, law firms, AEDT developers and advocacy organizations); and two public hearings.
As companies increasingly rely on automated tools to assist in the employment process and sort through large volumes of job applicants, New York City employers will now be forced to determine whether those tools constitute an AEDT and, if so, secure the required bias audit from their AEDT vendors or conduct one themselves.
The NYC AEDT Law defines an AEDT as:
“any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.”
The final rules clarify that the phrase “to substantially assist or replace discretionary decision-making” means (i) to rely solely on a simplified output (score, tag, classification, ranking, etc.) with no other factors considered; (ii) to use a simplified output as one of a set of criteria where the simplified output is weighted more than any other criterion in the set; or (iii) to use a simplified output to overrule conclusions derived from other factors, including human decision-making.
The final rules further clarify the definition of “machine learning, statistical modeling, data analytics, or artificial intelligence” as:
“a group of mathematical, computer-based techniques:
- that generate a prediction, meaning an expected outcome for an observation, such as an assessment of a candidate’s fit or likelihood of success, or that generate a classification, meaning an assignment of an observation to a group, such as categorizations based on skill sets or aptitude; and
- for which a computer at least in part identified the inputs, the relative importance placed on those inputs, and, if applicable, other parameters for the models in order to improve the accuracy of the prediction or classification.”
The DCWP indicates that these changes serve both to focus the definition of an AEDT and to expand its scope. However, the example included in the final rules, an AEDT used to screen resumes and schedule interviews for a job posting, highlights the breadth of the law. The DCWP notes that for such use, a bias audit is still necessary even though the employer is not using the AEDT to make the final hiring decision but only to screen at an early point in the application process.
Bias Audit Requirements
The NYC AEDT Law prohibits employers and employment agencies from using an AEDT unless:
(1) The tool has been subject to a bias audit within one year of the use of the tool; and
(2) A summary of the bias audit has been made publicly available on the employer’s or employment agency’s website.
Such bias audits must calculate the selection rate for each race/ethnicity and sex category that is required to be reported to the U.S. Equal Employment Opportunity Commission (EEOC) pursuant to the EEO-1 Component 1 report and compare each category’s selection rate with that of the most selected category to determine an impact ratio. The final rules provide explicit requirements depending on how the tool operates: selecting candidates or employees to move forward in the hiring process or classifying them into groups versus scoring candidates or employees.
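The selection-rate and impact-ratio arithmetic described above can be sketched as follows. The category names and applicant counts are hypothetical for illustration only; an actual audit would use the EEO-1 Component 1 race/ethnicity and sex categories and the required intersectional comparisons.

```python
# Sketch of the bias-audit arithmetic: selection rate = selected / assessed
# for each category; impact ratio = a category's selection rate divided by
# the highest selection rate among the categories. Counts are hypothetical.

data = {
    "Male":   {"assessed": 1000, "selected": 200},
    "Female": {"assessed": 800,  "selected": 120},
}

selection_rates = {
    cat: d["selected"] / d["assessed"] for cat, d in data.items()
}
highest = max(selection_rates.values())
impact_ratios = {cat: rate / highest for cat, rate in selection_rates.items()}

for cat in data:
    print(f"{cat}: selection rate {selection_rates[cat]:.2f}, "
          f"impact ratio {impact_ratios[cat]:.2f}")
# Male: selection rate 0.20, impact ratio 1.00
# Female: selection rate 0.15, impact ratio 0.75
```

An impact ratio well below 1.0 for a category would signal a potential adverse impact that the published audit summary must disclose.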
Bias audits must rely on historical data or on test data if the available historical data is insufficient to conduct a statistically significant bias audit. If more than a year has passed since the most recent bias audit of the AEDT, the employer or employment agency may not continue the use of the tool.
While the artificial intelligence (AI) auditing industry is still nascent, with industry standards and best practices still in development, New York City has indicated that the onus is on employers to provide proof of the outcome-focused bias audit but has noted that multiple employers may use the same audit, provided it includes the employer’s own historical data.
Several changes regarding bias audit requirements ultimately present in the final rules were oriented toward the following:
- Clarifying that an “independent auditor” may not be employed or have a financial interest in an employer/employment agency that seeks to use or continue to use an AEDT or in a vendor that developed/distributed the AEDT
- Clarifying that the required “impact ratio” must be calculated separately to compare sex categories, race/ethnicity categories and intersectional categories
- Adding a requirement that the bias audit indicate the number of individuals the AEDT assessed who are not included in the calculations because they fall within an unknown category, and requiring that number be included in the summary of results
- Allowing an independent auditor to exclude a category that comprises less than 2 percent of the data being used for the bias audit from the calculations of impact
- Clarifying that an employer or employment agency may rely on a bias audit of an AEDT that uses historical data of other employers or employment agencies only if it has provided historical data from its own use of the AEDT to the independent auditor conducting the bias audit or if it has never used the AEDT
- Providing examples of when an employer or employment agency may rely on a bias audit conducted with historical data, test data, or historical data from other employers and employment agencies
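The small-category carve-out noted above is a simple proportion test against the audit dataset. A minimal sketch, using hypothetical category labels and counts (not taken from the rules), might look like:

```python
# Hypothetical category counts in a bias-audit dataset. Under the final
# rules' carve-out, an independent auditor may exclude from the impact-ratio
# calculations any category comprising less than 2% of the data.
counts = {"A": 950, "B": 40, "C": 10}  # illustrative groups only
total = sum(counts.values())           # 1000 assessed individuals

included = {cat: n for cat, n in counts.items() if n / total >= 0.02}
excluded = {cat: n for cat, n in counts.items() if n / total < 0.02}

# The summary of results must still report how many assessed individuals
# were left out of the calculations.
print("included categories:", sorted(included))   # ['A', 'B']
print("excluded individuals:", sum(excluded.values()))  # 10
```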
The NYC AEDT Law also requires that candidates or employees who reside in the city be notified about the use of an AEDT in their assessment or evaluation for hiring or promotion, as well as the job qualifications and characteristics used by the AEDT. Such notice must be provided 10 business days prior to the use of the tool and allow the candidate or employee to request an alternative selection process or accommodation.
Notice can be provided in several ways. For candidates and employees, notice may be provided in a job posting or via U.S. mail or email. For candidates, notice may be provided on the careers or job section of the website; for employees, notice may be provided in a written policy or procedure. The employer or employment agency must also disclose on its website or make available to a candidate or employee within 30 days of receiving a written request (1) information about the type of data collected for the AEDT, (2) the source of the data collection and (3) the employer’s or employment agency’s data retention policy.
While the required notice must include instructions for how an individual can request an alternative selection process or reasonable accommodation under other laws if available, the final rules clarify that an employer or employment agency is not obligated to provide such an alternative selection process, at least under the NYC AEDT Law.
If an AEDT is in use on the law’s enforcement date, employers and employment agencies should be prepared to publish the required notice by June 20, 2023.
Violations of the law will result in civil penalties of up to $500 for the first violation and each additional violation occurring on the same day, and between $500 and $1,500 for each subsequent violation. Each day the AEDT is used in violation of the law constitutes a separate violation, and the failure to provide notice constitutes a separate violation.
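Because each day of noncompliant use is a separate violation, penalties accumulate quickly. A rough sketch of the exposure under the schedule above, assuming the maximum amounts apply (the day counts are hypothetical):

```python
def aedt_penalty(days_in_violation: int) -> int:
    """Illustrative maximum exposure under the schedule described above:
    up to $500 for the first violation (and same-day additional violations),
    and between $500 and $1,500 for each subsequent violation, with each
    day of use counting as a separate violation. This sketch assumes the
    maximum amount applies each day."""
    if days_in_violation <= 0:
        return 0
    # First day at up to $500; each later day at up to $1,500.
    return 500 + (days_in_violation - 1) * 1500

print(aedt_penalty(1))   # 500
print(aedt_penalty(30))  # 44000 (500 + 29 * 1500)
```

A separate failure to provide the required notice would add its own violations on top of this.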
The NYC AEDT Law is just the first among several emerging laws and regulations governing automated decision tools in the hiring process. In 2021, the EEOC launched an agencywide Artificial Intelligence and Algorithmic Fairness Initiative to ensure the use of AI, machine learning and other emerging technologies used in hiring and employment decisions comply with federal civil rights laws enforced by the agency. As EEOC Chair Charlotte A. Burrows stated, “the EEOC is keenly aware that these tools may mask and perpetuate bias or create new discriminatory barriers to jobs. We must work to ensure that these new technologies do not become a high-tech pathway to discrimination.”
At the state level, New York and New Jersey have introduced similar bills. Notably, like the NYC AEDT Law, neither bill prohibits the use of these tools even if a bias audit shows a discriminatory effect. The 2022 New Jersey bill does not even require publication of the bias audit results. Rather, New Jersey’s bill A4909 prohibits the sale of AEDTs unless the tool (1) is subject to a bias audit, (2) includes an annual bias audit service at no additional cost and (3) is sold with a notice stating the tool is subject to the law. If passed, the law would further require employers to notify each candidate within 30 days that the tool was used in connection with the candidate’s application for employment and that it assessed the candidate’s job qualifications or characteristics. The bill proposes civil penalties similar to those of the NYC AEDT Law.
New York State’s 2023 bill A00567 provides requirements similar to those of the NYC AEDT Law but instead calls the bias audit a “disparate impact analysis.” In addition to making a summary of the disparate impact analysis available on the employer’s or employment agency’s website, the bill also requires that the summary be provided to the New York Department of Labor on an annual basis. The law does not include civil penalties but instead permits enforcement by the state attorney general and commissioner of labor.