
Erasing Bias in Clinical Decision Support to Achieve True Health Equity
Many have claimed that technology will be the great equalizer in healthcare. But while technology is presumed to be objective, purely data-driven and free of prejudice, inherent biases can still hamper the ultimate goal of health equity.
Many have claimed that technology – from virtual care solutions to clinical decision support (CDS) systems – will be the great equalizer in healthcare.
These tools are viewed as an ideal solution for bridging the chasm that has prevented medically underserved communities, including racial and ethnic minorities and those who live in rural or socioeconomically disadvantaged areas, from receiving the same quality of care as others. Yet while technology is purportedly objective, purely data-driven and free of prejudice, inherent biases can still hamper the ultimate goal of health equity.
CDS systems have been designed to help providers assess the vast amounts of data that can play a role in care decisions. But CDS should not be only about guideline-driven care; it should also facilitate health equity. As new technology is developed and current capabilities advance, it's critical that we seek to combat bias through the guidelines themselves and the underlying clinical algorithms that drive CDS.
Overcoming Bias in Healthcare
Health equity implies "a system designed and built for inclusion and equal access to the best health outcomes for all," according to the Agency for Healthcare Research and Quality (AHRQ).
However, the agency believes, in reality, the definition is more expansive, referring to "inclusion and best health outcomes for all no matter their physical environment, personal behaviors and abilities, or social circumstances (gender identity, military services, sexual orientation, citizenship status, social status, history of incarceration, culture and tradition, social connectedness, work conditions, early childhood education and development)."
Overcoming all of these factors can be difficult for the average provider. One study utilizing the Implicit Association Test (IAT) revealed that most healthcare providers hold some degree of implicit bias, typically an unconscious preference for white patients over Black patients.
For example, a groundbreaking Institute of Medicine report found that Black Americans receive lower-quality healthcare than white Americans, even when access-related factors such as insurance status and income are held constant.
The result: Black Americans are more likely to experience poor health outcomes, including higher rates of chronic disease and premature death.
While much of this problem can stem from providers' attitudes, implicit biases and discrimination, clinical algorithms, tools and guidelines likely factor into these biases as well.
Refining Clinical Decision Support for Greater Health Equity
CDS has been defined as a "process for enhancing health-related decisions" that provides "clinicians, staff, patients or other individuals with knowledge and person-specific information, intelligently filtered or presented at appropriate times, to enhance health and health care."
However, to truly support both guideline-driven and equity-driven care, the developers of the algorithms and the CDS systems themselves must strive to eliminate bias.
A study completed in 2019 explained how a widely used commercial risk-prediction algorithm systematically underestimated the health needs of Black patients: because it used past healthcare spending as a proxy for illness, Black patients, on whose care less had historically been spent, had to be considerably sicker than white patients to receive the same risk score.
Additionally, the National Kidney Foundation (NKF) recently announced a joint task force with the American Society of Nephrology to reassess the inclusion of race in equations that estimate kidney function, a coefficient that can inflate estimates for Black patients and delay diagnosis and transplant referral.
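The kidney-function case shows concretely how a single race coefficient in a clinical algorithm can shift a patient across a treatment threshold. A minimal sketch, using the coefficients of the 2009 CKD-EPI creatinine equation as published (the 2021 refit removed the race term); this code is illustrative only, not a clinical tool:

```python
def egfr_ckd_epi_2009(scr_mg_dl: float, age: int, female: bool, black: bool) -> float:
    """2009 CKD-EPI creatinine equation (mL/min/1.73 m^2).

    Shown here to illustrate how a race coefficient changes the estimate;
    current guidance uses race-free equations.
    """
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # race coefficient: inflates the estimate by 15.9%
    return egfr

# Same labs, same age and sex, different race flag:
with_race = egfr_ckd_epi_2009(1.4, 60, female=False, black=True)
without_race = egfr_ckd_epi_2009(1.4, 60, female=False, black=False)
```

For this hypothetical patient the race term alone lifts the estimate above the common eGFR-of-60 threshold, which is exactly how such a coefficient can delay a chronic kidney disease diagnosis or a transplant referral for Black patients.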
In spring of 2021, the AHRQ, at the request of Congress, issued a request for information seeking evidence and public comment on how clinical algorithms may introduce bias and contribute to racial and ethnic disparities in care.
Done right – through careful design, greater awareness and a thoughtful approach to eliminating potential biases – algorithms and CDS systems can mitigate such problems. And in the process, they can decrease disparities in care to achieve greater health equity.
Additionally, when algorithms are developed based on data that accurately represents the diversity of the patient population to which they are applied, they can also achieve better patient outcomes. Therefore, demographic, clinical, socioeconomic and even patient preference data should be included when leveraging CDS systems. This data may not be available directly in the EHR, and can require retrieving social determinants of health (SDOH) and patient-reported outcome (PRO) data from other sources.
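One way to operationalize that advice, before an algorithm is ever deployed, is to compare each demographic group's share of the training data against its share of the patient population the CDS will serve. A minimal sketch, with illustrative group names and numbers (not drawn from any specific CDS product):

```python
from collections import Counter

def representativeness_gaps(training_groups, population_shares):
    """For each group, report (training-data share - population share).

    A large negative gap means the group is underrepresented in the
    training data relative to the population the algorithm will serve,
    a warning sign for biased predictions.
    """
    total = len(training_groups)
    counts = Counter(training_groups)
    return {group: counts.get(group, 0) / total - share
            for group, share in population_shares.items()}

# Illustrative data: the training cohort skews toward one group.
training = ["white"] * 800 + ["black"] * 120 + ["hispanic"] * 80
population = {"white": 0.60, "black": 0.20, "hispanic": 0.20}

gaps = representativeness_gaps(training, population)
# gaps["white"] is positive (overrepresented); the other groups'
# gaps are negative (underrepresented).
```

The same check can be repeated over any attribute the article mentions, such as socioeconomic or SDOH categories, once that data has been pulled in from outside the EHR.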
Looking ahead, it is vital that designers of CDS solutions think about and plan for health equity from the start of development. By taking this approach, CDS can truly deliver on the promise of fostering both guideline-driven and equity-driven care that improves outcomes and the overall health of everyone.