For decades, the healthcare industry has operated under a comforting, if flawed, assumption: Data is objective. It has treated numbers, spreadsheets, and algorithms as neutral observers—mirrors that simply reflect the reality of patient health.

But as the industry leans further into the era of automated healthcare and AI-driven workflows, it is discovering a difficult truth. Data is not a mirror; it is a series of choices. It reflects what is valued enough to measure, how success is defined, and whose life experiences are treated as the standard.

When discussing the design of health equity, it is essential to acknowledge that equity doesn’t fail due to a lack of good intentions. It fails at the drafting table. It fails when systems are built to optimize for efficiency or cost without first asking: Efficiency for whom?

The Objective Data Trap

The gap in health equity often begins with the invisible patient. If a data system isn’t designed to capture social determinants of health (SDOH)—like transportation access, housing stability, or primary language—those factors effectively don’t exist in the eyes of the machine.
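A minimal sketch can make this "invisible patient" problem concrete. The record below is hypothetical (the field names are illustrative, not a real EHR schema): if the intake workflow never asks about transportation, housing, or language, those fields stay empty, and anything built downstream is blind to them.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical patient record; field names are illustrative, not a real EHR schema.
@dataclass
class PatientRecord:
    patient_id: str
    diagnoses: list = field(default_factory=list)
    # SDOH fields: if the intake form never asks, these remain None,
    # and any downstream model or report simply cannot see them.
    has_reliable_transport: Optional[bool] = None
    housing_stable: Optional[bool] = None
    primary_language: Optional[str] = None

record = PatientRecord(patient_id="p-001", diagnoses=["E11.9"])

# Only the fields that were actually captured survive into analysis.
captured = {k: v for k, v in vars(record).items() if v is not None}
```

Here `captured` contains only the ID and diagnoses: the social context that might explain the patient's outcomes was never measured, so for the machine it does not exist.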

A 2019 study published in Science (Obermeyer et al.) revealed that a widely used commercial algorithm meant to identify “high-risk” patients for extra care was deeply biased. The designers had used healthcare spending as a proxy for health needs.

Because of systemic barriers, Black patients with the same chronic conditions as white patients had lower historical healthcare spending. The result? The algorithm consistently ranked healthier white patients as “higher risk” than sicker Black patients. The data was “accurate” regarding dollars spent, but the design choice to equate cost with need made the system a tool for inequity.

Why Design Choices Matter in Automation

In automated healthcare—from patient self-scheduling to AI-driven triage—the user interface (UI) and the underlying logic are where equity is won or lost.

Traditional automated systems often flag patients who miss appointments as non-compliant. This label then follows the patient in their Electronic Health Record (EHR), influencing how future providers perceive them.

Designing for health equity means changing the way data is captured. Instead of a binary Attended/No-Show, an equitable system asks why. Did the automated SMS reminder arrive in a language the patient speaks? Was the appointment scheduled at a time when public transit isn’t running? When systems are designed to capture the why, care shifts from penalizing patients to solving structural barriers.
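One way to capture the why is to replace the binary flag with a structured reason code recorded alongside the outcome. The sketch below is illustrative only; the reason categories are assumptions, and a real system would define them with clinicians and patients.

```python
from enum import Enum, auto
from typing import Optional

# Illustrative reason codes; a real taxonomy would be co-designed with patients.
class MissedVisitReason(Enum):
    NO_TRANSPORT = auto()
    WORK_CONFLICT = auto()
    LANGUAGE_BARRIER = auto()
    REMINDER_NOT_RECEIVED = auto()
    OTHER = auto()

def record_visit_outcome(attended: bool,
                         reason: Optional[MissedVisitReason] = None) -> dict:
    """Store the why alongside the what, instead of a bare no-show flag."""
    outcome = {"attended": attended}
    if not attended:
        outcome["reason"] = reason or MissedVisitReason.OTHER
    return outcome

# A missed visit now carries actionable context rather than a stigmatizing label.
outcome = record_visit_outcome(False, MissedVisitReason.NO_TRANSPORT)
```

Aggregating these reason codes is what turns missed appointments from a compliance metric into a map of structural barriers worth fixing.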

Automation relies on proxies. Since a computer can’t see a patient’s pain or struggle, it looks for related data points.

Standard Design: Uses the Number of visits as a proxy for Urgency.

Equity-Centred Design: Recognizes that a patient with three jobs and no childcare might have zero visits despite high urgency.
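The contrast between the two designs can be sketched in a few lines. The numbers and scoring weights below are invented for illustration; the point is only that the choice of proxy, not the arithmetic, decides who gets flagged.

```python
# Toy illustration of the proxy problem described above; all figures are invented.
def risk_by_cost(patient: dict) -> float:
    # Standard design: prior spending stands in for need.
    return patient["annual_spend_usd"]

def risk_by_need(patient: dict) -> float:
    # Equity-centred design: score clinical severity directly, so low spending
    # caused by access barriers cannot hide genuine need.
    return patient["chronic_conditions"] * 10 + patient["uncontrolled_markers"] * 5

# Patient a: healthier, but with easy access to care (high historical spend).
# Patient b: sicker, but facing barriers to care (low historical spend).
a = {"annual_spend_usd": 9000, "chronic_conditions": 1, "uncontrolled_markers": 0}
b = {"annual_spend_usd": 3000, "chronic_conditions": 4, "uncontrolled_markers": 3}
```

Under the cost proxy, patient a outranks patient b for extra care; under the severity score, the ranking flips. Same patients, same data, opposite outcomes, purely from the design choice of proxy.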

Case Study: The Pulse Oximeter and the Skin Tone Gap

Perhaps the most visceral example of design-stage failure is the pulse oximeter. For years, these devices—standard in every hospital—were designed and calibrated primarily on lighter skin tones.

During the COVID-19 pandemic, research highlighted that these devices were significantly more likely to overestimate oxygen levels in patients with darker skin. This wasn’t a glitch; it was a design choice in the R&D phase that failed to prioritize a diverse range of skin pigmentations. 

Because the data provided to clinicians was technically recorded but biologically inaccurate for a portion of the population, Black and Hispanic patients faced delays in receiving life-saving oxygen therapy and steroids.

Closing the Gaps: Strategies for Designing for Health Equity

To move beyond awareness and into action, healthcare leaders and technology partners must rethink the architecture of their decision-making.

Equity Gap                | Design-Led Solution
Data Under-representation | Mandate diverse datasets that include rural, low-income, and multi-ethnic populations in the training phase.
Digital Redlining         | Ensure automated tools work on low-bandwidth connections and older smartphone models, not just the latest tech.
Algorithmic Bias          | Replace “cost” or “utilization” proxies with clinical severity and social vulnerability indices (SVI).
Linguistic Isolation      | Move beyond “Google Translate” and design native-language interfaces that respect cultural nuance.

The Final Word

To achieve true health equity, stop asking “What does the data say?” and start asking:

Who is missing from this dataset?

What assumptions were made when choosing this metric?

How will this automated decision impact a patient with the least amount of resources?

Automation should be a bridge, not a barrier. By intentionally designing for health equity, organizations can ensure that the efficiency of a system never comes at the cost of a patient’s dignity or access to care.

Healthcare data is a powerful tool, but it is only as equitable as the people—and the choices—behind it. It’s time to stop blaming the numbers and start fixing the design.
