On September 6, 2023, California Governor Gavin Newsom signed Executive Order N-12-23 (the “Executive Order”) relating to the use of generative artificial intelligence (“GenAI”) by the State, as well as the preparation of certain reports assessing the equitable use of GenAI in the public sector. The Executive Order instructs State agencies to assess the risks inherent in the use of GenAI and creates a blueprint for public sector implementation of GenAI tools in the near future. The Executive Order indicates that California anticipates expanding the role that GenAI tools play in helping State agencies achieve their missions, while simultaneously ensuring that those agencies identify and study any negative effects that the implementation of GenAI tools might have on residents of the State. The Executive Order covers a number of areas, including:
1. Risk Analysis & Reporting. By early November 2023, State agencies are required to (i) collaborate to create a report on the potential beneficial uses of GenAI by the State and potential high-risk use cases of GenAI (e.g., where GenAI is used for consequential decision-making that affects access to essential goods or services) and (ii) conduct and submit to the California Department of Technology an inventory of current high-risk use cases of GenAI. Additionally, by March 2024, the California Cybersecurity Integration Center and the California State Threat Assessment Center will perform a joint risk analysis of potential threats and vulnerabilities posed by the use of GenAI to California’s critical energy infrastructure (including those that could lead to mass casualty events or environmental emergencies).
2. Procurement Guidelines. By January 2024, State agencies are required to collaborate to issue guidelines for public sector procurement, use, and required trainings relating to the use of GenAI by government agencies (which guidelines will be based on the White House’s “Blueprint for an AI Bill of Rights” and the National Institute of Standards and Technology’s “AI Risk Management Framework”), and, by January 2025, to update California’s project approval, procurement, and contract terms to incorporate those guidelines.
3. Impact Analysis Framework. By July 2024, State agencies are required to develop guidelines to analyze the impact that adopting GenAI may have on vulnerable communities, including criteria to evaluate equitable outcomes in the deployment and implementation of GenAI in high-risk use cases.
4. Adoption of Pilot Programs. State agencies are required to (i) consider procurement and enterprise use of GenAI where it can improve the efficiency, effectiveness, accessibility, or equity of government operations, consistent with the guidelines for public sector GenAI procurement, (ii) establish infrastructure by March 2024 to conduct pilot programs utilizing GenAI in approved environments (called “sandboxes”) to test such projects, and (iii) by July 2024, test such pilot programs by considering how the use of GenAI improves Californians’ experience with, and access to, government services, and how GenAI can support State employees in the performance of their duties.
5. Trainings. By July 2024, State agencies are required to make available trainings for State employees on the use of State-approved GenAI tools to achieve equitable outcomes and to identify and mitigate potential inaccuracies, fabrications, hallucinations, and biases.
6. Partnership with Stanford and UC Berkeley. Certain State agencies are directed to pursue formal partnerships with the University of California, Berkeley’s College of Computing, Data Science, and Society and Stanford University’s Institute for Human-Centered Artificial Intelligence to consider and evaluate the impacts of GenAI on the State of California, including by hosting a California-specific summit on the use of GenAI in 2024.
Although limited in scope, the Executive Order indicates that California takes seriously its role as a global technology leader and that it will invest in public sector use of GenAI tools going forward, while remaining mindful of the risks inherent in the use of such tools.