On April 25, 2023, four federal agencies — the Department of Justice (“DOJ”), Federal Trade Commission (“FTC”), Consumer Financial Protection Bureau (“CFPB”), and Equal Employment Opportunity Commission (“EEOC”) — released a joint statement on the agencies’ efforts to address discrimination and bias in automated systems.
The statement applies to “automated systems,” which are broadly defined “to mean software and algorithmic processes” beyond AI. Although the statement notes the significant benefits that can flow from the use of automated systems, it also cautions against unlawful discrimination that may result from that use.
The statement begins by summarizing the existing legal authorities that apply to automated systems, along with each agency’s AI-related guidance and statements. Helpfully, the statement aggregates links to key AI-related guidance documents from each agency, providing a one-stop shop for important AI-related publications from all four entities. For example, it summarizes the EEOC’s remit in enforcing federal laws that prohibit discrimination against applicants and employees, describes the EEOC’s enforcement activities related to AI, and links to a technical assistance document. Similarly, the statement outlines the FTC’s reports and guidance on AI and includes multiple links to FTC AI-related documents.
After providing an overview of each agency’s position and links to key documents, the statement summarizes the following sources of potential discrimination and bias, which may signal the regulatory and enforcement priorities of these agencies:
- Data and Datasets: The statement notes that outcomes generated by automated systems can be skewed by unrepresentative or imbalanced datasets. It adds that flawed datasets, as well as correlations between data and protected classes, can lead to discriminatory outcomes.
- Model Opacity and Access: The statement observes that some automated systems are “black boxes,” meaning their internal workings are not transparent to people and are therefore difficult to oversee.
- Design and Use: The statement also notes that flawed assumptions about users and about how a system will be used may contribute to unfair or biased outcomes.
We will continue to monitor these and related developments across our blogs.