The expanded use of artificial intelligence (AI) in the delivery of health care continues to receive increased attention from lawmakers across the country. Although AI regulation is still in its early developmental stages, various efforts are underway to address the unintended negative consequences posed by AI technology, particularly in health care and other key sectors. Of particular interest are regulatory efforts to restrict discrimination through AI and related technologies.
By way of background, AI involves the use of machine learning technology, software, automation, and algorithms to perform tasks and make rules and predictions based on existing datasets and instructions. Of particular significance, there is tremendous concern that use of AI technologies may produce unintended discriminatory results. For example, AI technology may be trained using data that is not representative of the actual population (e.g., data focused on one racial group or gender to the exclusion of others), which could then result in patterns of bias and discrimination being effectively baked into the AI technology. Similarly, AI technologies are often trained to recognize patterns and may focus on “dominant” patterns to the exclusion of others.
California is once again at the forefront of proposing legislation to tackle this rapidly evolving landscape.
California’s Proposed AI Legislation in Health Care
On February 17, 2023, California Assemblymember Pilar Schiavo introduced California Assembly Bill 1502, which seeks to ban health care service plans or health insurers from discriminating on the basis of race, color, national origin, sex, age, or disability through use of clinical algorithms in decision making. While California has adopted other laws that similarly prohibit discrimination, Assembly Bill 1502 would specifically target use of AI. To be clear, the bill would not operate to prohibit use of clinical algorithms that rely on variables to appropriately make decisions, including to identify, evaluate, and address health disparities.
On January 30, 2023, California State Assemblymember Rebecca Bauer-Kahan introduced Assembly Bill 331, which provides a general framework for regulating algorithmic discrimination in the use of automated decision tools that make consequential decisions. “Consequential decisions” are those decisions that surround actions affecting certain enumerated individual rights and opportunities, such as rights associated with health care and health insurance. Assembly Bill 331 also seeks to address demonstrated algorithmic harms in employment, education, housing, and financial services. The bill’s requirements specifically attach rights and responsibilities to the pertinent stakeholders, including those that use the tools to make consequential decisions (referred to as “Deployers”), although it would exempt very small deployers, and those that create such tools (referred to as “Developers”).
To ensure safe and effective systems and algorithmic discrimination protections, the bill lays out certain requirements including, but not limited to, performance of impact assessments, notice and explanation requirements, opt-out request mechanisms, disclosure obligations, and governance programs that must incorporate reasonable administrative and technical safeguards. Deployers would be required to adhere to each of the foregoing components to effectively utilize the automated decision tools and mitigate foreseeable risks of algorithmic discrimination. Assembly Bill 331 would also create a private right of action against Deployers for uses that result in algorithmic discrimination, and would empower the California Attorney General and other public attorneys to bring a civil action against a Developer or Deployer for injunctive relief, declaratory relief, and reasonable attorney’s fees and litigation costs.
With the uptick in AI and other machine learning technologies in health care services and products, it is imperative that health care entities conduct critical assessments and execute appropriate interventions to mitigate the risk of any perceived discriminatory impact by evaluating their use with the relevant stakeholders, in consultation with legal counsel. If the proposed bills progress through the California legislature, they have the potential to set a standard for how other states approach this issue. Even if the bills do not pass, they nevertheless provide valuable insight into how lawmakers may approach regulating AI technology in health care moving forward.
We will continue to monitor the progress of the proposed bills through the California legislature. If you have any questions about this recent legislation or its impact on your organization, please contact a member of the Sheppard Mullin Healthcare Team.
 These efforts are demonstrated in the Biden Administration’s Blueprint for AI Bill of Rights published in October 2022 as well as in legislative efforts to address use of AI in the healthcare space, such as use of AI when conducting certain medical diagnoses, guiding nurses’ decision-making, and regulating use of AI to support eye assessments.
 A “health care service plan” includes: “(1) Any person who undertakes to arrange for the provision of health care services to subscribers or enrollees, or to pay for or to reimburse any part of the cost for those services, in return for a prepaid or periodic charge paid by or on behalf of the subscribers or enrollees. (2) Any person, whether located within or outside of this state, who solicits or contracts with a subscriber or enrollee in this state to pay for or reimburse any part of the cost of, or who undertakes to arrange or arranges for, the provision of health care services that are to be provided wholly or in part in a foreign country in return for a prepaid or periodic charge paid by or on behalf of the subscriber or enrollee.” Cal. Health & Safety Code § 1345(f).
 A “health insurer” includes a range of insurers that are regulated under California’s Insurance Code.
 Assembly Bill 331 is currently pending in committee under submission.