“The EEOC is keenly aware that [artificial intelligence and algorithmic decision-making] tools may mask and perpetuate bias or create new discriminatory barriers to jobs. We must work to ensure that these new technologies do not become a high-tech pathway to discrimination.”

Statement from EEOC Chair Charlotte A. Burrows in late October 2021 announcing the agency’s launch of an initiative to ensure that artificial intelligence (AI) and other emerging tools used in hiring and other employment decisions comply with federal civil rights laws.

The EEOC is not alone in its concerns about the use of AI, machine learning, and related technologies in employment decision-making. On March 25, 2022, California’s Fair Employment and Housing Council discussed draft regulations regarding automated-decision systems. The draft regulations were informed by testimony at a hearing on Algorithms & Bias that the Department of Fair Employment and Housing (DFEH) held last year.

Algorithms are increasingly having a significant impact on people’s lives, including in connection with important employment decisions such as job applicant screening. Depending on the design of these newer technologies and the data used, AI and similar tools risk perpetuating biases that are hard to detect. Of course, the AI conundrum is not limited to employment. Research in the US and China, for example, suggests AI biases can lead to disparities in healthcare.

The draft regulations would update the DFEH’s existing rules to cover newer technologies such as algorithms, which they refer to as an “automated-decision system” (ADS). The draft regulations define an ADS as: a computational process, including one derived from machine-learning, statistics, or other data processing or artificial intelligence techniques, that screens, evaluates, categorizes, recommends, or otherwise makes a decision or facilitates human decision making that impacts employees or applicants.

Examples of ADS include:

  • Algorithms that screen resumes for particular terms or patterns
  • Algorithms that employ face and/or voice recognition to analyze facial expressions, word choices, and voices
  • Algorithms that employ gamified testing, including questions, puzzles, or other challenges, used to make predictive assessments about an employee or applicant or to measure characteristics including but not limited to dexterity, reaction time, or other physical or mental abilities or characteristics
  • Algorithms that employ online tests meant to measure personality traits, aptitudes, cognitive abilities, and/or cultural fit
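To make the first category concrete, the sketch below is a hypothetical, deliberately simplified resume screen of the kind the draft regulations contemplate. The keywords, threshold, and function names are invented for illustration; real ADS products are far more complex, but even a simple keyword filter can “tend to screen out” applicants who describe equivalent experience in different words.

```python
import re

# Hypothetical required terms and threshold -- invented for illustration only.
REQUIRED_TERMS = {"python", "sql", "agile"}

def screen_resume(resume_text: str, min_matches: int = 2) -> bool:
    """Recommend an applicant if the resume contains enough required terms.

    A crude example of an "automated-decision system": the decision turns
    entirely on surface wording, not on actual qualifications.
    """
    words = set(re.findall(r"[a-z]+", resume_text.lower()))
    return len(REQUIRED_TERMS & words) >= min_matches

# An applicant using the expected vocabulary passes; an applicant describing
# equivalent experience in other words is silently screened out.
print(screen_resume("Experienced in Python, SQL, and Agile delivery"))   # True
print(screen_resume("Built data pipelines and led iterative releases"))  # False
```

The hidden-bias risk the regulators describe arises when wording choices like these correlate with protected characteristics, producing disparate screening outcomes that are hard to detect from the outside.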

The draft regulations would make it unlawful for an employer or covered entity to use qualification standards, employment tests, ADS, or other selection criteria that screen out or tend to screen out an applicant or employee or a class of applicants or employees based on characteristics protected by the Fair Employment and Housing Act (FEHA), unless the standards, tests, or other selection criteria, are shown to be job-related for the position in question and are consistent with business necessity.

The draft regulations include rules for both the applicant selection and interview processes. Specifically, the use of and reliance upon ADS that limit or screen out or tend to limit or screen out applicants based on protected characteristics may constitute a violation of the FEHA.

The draft regulations would expand employers’ record-keeping obligations to include machine-learning data and would extend the retention period for covered records from two years to four. Additionally, the draft regulations would add a record-retention requirement for any person “who engages in the advertisement, sale, provision, or use of a selection tool, including but not limited to an automated-decision system, to an employer or other covered entity.” These persons, who might include third-party vendors supporting employers’ use of such technologies, would be required to retain records of the assessment criteria used by the ADS for each employer or covered entity.

During the March 25th meeting, it was stressed that the regulations are intended to show how current law applies to new technology, not to create new liabilities. Whether that holds true remains to be seen: if adopted, the regulations could expand exposure to liability, or at least invite more challenges to employers leveraging these technologies.

The regulations are currently in the pre-rulemaking phase, and the DFEH is accepting public comment. Comments may be submitted to the Fair Employment and Housing Council at FEHCouncil@dfeh.ca.gov.

Jackson Lewis will continue to track regulations affecting employers. If you have questions about the use of automated decision-making in the workplace or related issues, contact the Jackson Lewis attorney with whom you regularly work.

Joseph J. Lazzarotti is a principal in the Berkeley Heights, New Jersey, office of Jackson Lewis P.C. He founded and currently co-leads the firm’s Privacy, Data and Cybersecurity practice group, edits the firm’s Privacy Blog, and is a Certified Information Privacy Professional (CIPP) with the International Association of Privacy Professionals. Trained as an employee benefits lawyer, focused on compliance, Joe also is a member of the firm’s Employee Benefits practice group.

In short, his practice focuses on the matrix of laws governing the privacy, security, and management of data, as well as the impact and regulation of social media. He also counsels companies on compliance, fiduciary, taxation, and administrative matters with respect to employee benefit plans.

Privacy and cybersecurity experience – Joe counsels multinational, national and regional companies in all industries on the broad array of laws, regulations, best practices, and preventive safeguards. The following are examples of areas of focus in his practice:

  • Advising health care providers, business associates, and group health plan sponsors concerning HIPAA/HITECH compliance, including risk assessments, policies and procedures, incident response plan development, vendor assessment and management programs, and training.
  • Coaching hundreds of companies through the investigation, remediation, notification, and overall response to data breaches of all kinds – PHI, PII, payment card, etc.
  • Helping organizations address questions about the application, implementation, and overall compliance with European Union’s General Data Protection Regulation (GDPR) and, in particular, its implications in the U.S., together with preparing for the California Consumer Privacy Act.
  • Working with organizations to develop and implement video, audio, and data-driven monitoring and surveillance programs. For instance, in the transportation and related industries, Joe has worked with numerous clients on fleet management programs involving the use of telematics, dash-cams, event data recorders (EDR), and related technologies. He also has advised many clients in the use of biometrics including with regard to consent, data security, and retention issues under BIPA and other laws.
  • Assisting clients with growing state data security mandates to safeguard personal information, including steering clients through detailed risk assessments and converting those assessments into practical “best practice” risk management solutions, including written information security programs (WISPs). Related work includes compliance advice concerning FTC Act, Regulation S-P, GLBA, and New York Reg. 500.
  • Advising clients about best practices for electronic communications, including in social media, as well as when communicating under a “bring your own device” (BYOD) or “company owned personally enabled device” (COPE) environment.
  • Conducting various levels of privacy and data security training for executives and employees.
  • Supporting organizations through mergers, acquisitions, and reorganizations with regard to the handling of employee and customer data, and the safeguarding of that data during the transaction.
  • Representing organizations in matters involving inquiries into privacy and data security compliance before federal and state agencies including the HHS Office of Civil Rights, Federal Trade Commission, and various state Attorneys General.

Benefits counseling experience – Joe’s work in the benefits counseling area covers many areas of employee benefits law. Below are some examples of that work:

  • As part of the Firm’s Health Care Reform Team, advising employers and plan sponsors regarding the establishment, administration, and operation of fully insured and self-funded health and welfare plans to comply with ERISA, the IRC, ACA/PPACA, HIPAA, COBRA, the ADA, GINA, and other related laws.
  • Guiding clients through the selection of plan service providers, along with negotiating service agreements with vendors to address plan compliance and operations, while leveraging data security experience to ensure plan data is safeguarded.
  • Counseling plan sponsors on day-to-day compliance and administrative issues affecting plans.
  • Assisting in the design and drafting of benefit plan documents, including severance and fringe benefit plans.
  • Advising plan sponsors concerning employee benefit plan operation, administration, and correcting errors in operation.

Joe speaks and writes regularly on current employee benefits, data privacy, and cybersecurity topics, and his work has been published in leading business and legal journals and media outlets, such as The Washington Post, Inside Counsel, Bloomberg, The National Law Journal, Financial Times, Business Insurance, HR Magazine and NPR, as well as the ABA Journal, The American Lawyer, Law360, Bender’s Labor and Employment Bulletin, the Australian Privacy Law Bulletin, and the Privacy and Data Security Law Journal.

Joe served as a judicial law clerk for the Honorable Laura Denvir Stith on the Missouri Court of Appeals.