
In a letter to hospital CEOs, California Attorney General Rob Bonta asked how healthcare facilities and other providers are addressing racial and ethnic disparities in commercial decision-making tools and algorithms.
In other words: He was coming after their algorithmic biases.
Some takeaways:
- While many factors contribute to disparities in healthcare access, quality, and outcomes, research suggests that bias in decision-making tools or algorithms is likely a contributor.
- The California Attorney General’s Office is committed to addressing disparities in healthcare and ensuring compliance with state non-discrimination laws in hospitals and other healthcare settings.
- The AG is asking for information about the decision-making tools used in hospitals that could be contributing to racially biased treatment or outcomes, including:
  - A list of all commercially available or purchased decision-making tools, products, software systems, or algorithmic methodologies currently in use in the hospital.
  - The purposes for which these tools are currently used, how these tools inform decisions, and any policies, procedures, training, or protocols that apply to use of these tools in the facilities.
  - The name or contact information of the person(s) responsible for evaluating the purpose and use of these tools and ensuring that they do not have a disparate impact based on race or other protected characteristics.