Incident: Algorithm Bias in Healthcare Prioritization Software Disadvantaging Black Patients

Published Date: 2019-10-24

Postmortem Analysis
Timeline 1. The software failure incident described in the article occurred in 2019; the article was published on 2019-10-24 [90342].
System 1. The patient-prioritization algorithm used in the care of 70 million patients, potentially provided by Optum, a subsidiary of UnitedHealth [90342]
Responsible Organization 1. The algorithm provider, a subsidiary of an insurance company (potentially Optum, owned by UnitedHealth), which developed and supplied the biased algorithm [90342].
Impacted Organization 1. Black patients at a major US hospital [90342]
Software Causes 1. The failure was caused by an algorithm, used by a major US hospital, that predicted patients' future health costs as a proxy for their health needs and in doing so systematically privileged white patients over black patients when determining eligibility for special care programs [90342].
Non-software Causes 1. Socioeconomic inequalities leading to disparities in access to healthcare based on income levels and societal factors [90342].
Impacts 1. The algorithm used at a major US hospital favored white patients over black patients when selecting candidates for extra care programs, cutting the proportion of black patients who received extra help by more than half [90342]. 2. The biased algorithm effectively excluded black patients from necessary care, potentially increasing their chances of emergency room visits and hospital stays [90342]. 3. The skewed performance highlighted how even race-neutral formulas can have discriminatory effects when they rely on data that reflects societal inequalities [90342]. 4. Because the software predicted future health costs as a proxy for health needs, it replicated unevenness in access to healthcare, illustrating the hazards of combining optimizing algorithms with data that reflects social realities (see the sketch after this list) [90342]. 5. The algorithm's focus on predicted costs rather than actual health needs disadvantaged black patients, who tend to have lower incomes, producing a skewed patient prioritization system [90342].
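The proxy-label hazard in items 4 and 5 can be illustrated with a small, purely synthetic simulation. The two-group split, the 0.7 cost-suppression factor, and the 3 percent enrollment cutoff below are illustrative assumptions, not figures from the article:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Both groups carry the same underlying health burden.
illness = rng.normal(loc=5.0, scale=1.0, size=n)
low_access = rng.random(n) < 0.5  # True marks the group with less access to care

# Observed spending tracks illness, but is suppressed for the low-access group
# because equally sick patients there generate fewer billable encounters.
cost = illness * np.where(low_access, 0.7, 1.0) + rng.normal(0.0, 0.5, size=n)

# Even a perfect cost predictor (here, the cost itself) used as a "risk" score
# enrolls far fewer low-access patients despite identical illness levels.
threshold = np.quantile(cost, 0.97)  # e.g., auto-enroll the top 3 percent
enrolled = cost >= threshold
print("enrollment rate, low-access group:", enrolled[low_access].mean())
print("enrollment rate, other group:     ", enrolled[~low_access].mean())
```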
Preventions 1. Implementing thorough and regular audits of the algorithm's performance to identify biases and discriminatory effects, for example by comparing the measured health burden of different patient groups who receive the same risk score (see the audit sketch after this list) [90342]. 2. Ensuring that the algorithm accounts for relevant factors beyond future health costs, such as the frequency of chronic condition flare-ups, to provide a more holistic assessment of patients' health needs [90342]. 3. Developing and testing alternative versions of the algorithm that reduce disparities between patient groups, such as the approach that combined future costs and flare-up predictions and significantly reduced the skew between white and black patients [90342]. 4. Enforcing regulations or guidelines on the development and deployment of patient prioritization software in healthcare to prevent discriminatory outcomes and ensure fair treatment for all patients [90342].
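A minimal sketch of the kind of audit item 1 describes, assuming a pandas DataFrame with hypothetical columns risk_score, race, and n_chronic_conditions; it compares the measured health burden of groups who receive similar scores, in the spirit of the finding that black patients were sicker than white patients assigned the same score:

```python
import pandas as pd

def audit_risk_score_bias(df: pd.DataFrame) -> pd.DataFrame:
    """Compare mean chronic-condition counts across groups within risk-score deciles."""
    df = df.copy()
    # Bucket patients into deciles of the algorithm's risk score.
    df["risk_decile"] = pd.qcut(df["risk_score"], 10, labels=False, duplicates="drop")
    burden = (
        df.groupby(["risk_decile", "race"])["n_chronic_conditions"]
          .mean()
          .unstack("race")
    )
    # A consistently positive gap means black patients must be sicker than
    # white patients to receive the same score -- the pattern reported here.
    burden["gap_black_minus_white"] = burden["black"] - burden["white"]
    return burden

# Example usage (patients_df is a hypothetical extract of scored patients):
# print(audit_risk_score_bias(patients_df))
```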
Fixes 1. Implementing algorithms that combine a patient's predicted future costs with the number of times a chronic condition is expected to flare up over the next year, an approach that reduced the skew between white and black patients by more than 80 percent (a blended-score sketch follows this list) [90342]. 2. Addressing the deeper causes of health inequalities through policies such as improved family leave, better working conditions, and more flexible clinic hours to ensure equitable access to healthcare for all [90342].
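A minimal sketch of the blended-objective fix in item 1, assuming the model can emit separate predictions for future cost and chronic-condition flare-ups; the rank normalization and the 50/50 weighting are illustrative choices, not the formula the researchers or the vendor actually used:

```python
import numpy as np

def blended_priority_score(pred_cost: np.ndarray,
                           pred_flareups: np.ndarray,
                           cost_weight: float = 0.5) -> np.ndarray:
    """Rank patients by a mix of predicted cost and predicted flare-ups."""

    def rank_normalize(x: np.ndarray) -> np.ndarray:
        # Convert raw predictions to percentile ranks so both signals
        # contribute on a common 0-1 scale regardless of their units.
        ranks = x.argsort().argsort().astype(float)
        return ranks / (len(x) - 1) if len(x) > 1 else np.zeros_like(ranks)

    return (cost_weight * rank_normalize(pred_cost)
            + (1.0 - cost_weight) * rank_normalize(pred_flareups))

# Patients above a chosen cutoff of this blended score would then be
# auto-enrolled or referred for review, as the original risk score was used.
```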
References 1. Researchers from UC Berkeley, the University of Chicago, and Brigham and Women’s and Massachusetts General hospitals in Boston [90342] 2. Optum, a subsidiary of the insurance company UnitedHealth [90342] 3. Linda Goler Blount, president and CEO of the nonprofit Black Women’s Health Imperative [90342] 4. Researchers at Santa Clara University and Virginia Commonwealth University [90342]

Software Taxonomy of Faults

Category Option Rationale
Recurring unknown a) The software failure incident involving biased algorithms that favor white patients over black patients in healthcare is reported at one organization. The incident involved an algorithm used by a large academic hospital in the US that systematically privileged white patients over black patients for special care programs [90342]. The algorithm, developed by a subsidiary of an insurance company and used in the care of 70 million patients, was confirmed to produce biased judgments, favoring white patients over black patients with similar health burdens. The company behind the algorithm has acknowledged the issue and is working to address it [90342]. b) There is no specific information in the articles indicating that a similar software failure incident has occurred at multiple organizations; the articles focus on the algorithm used by this particular hospital and the implications of biased algorithms in healthcare.
Phase (Design/Operation) design, operation (a) The software failure incident relates to the design phase. The algorithm used by a major US hospital to guide care for patients with complex health needs systematically privileged white patients over black patients. The algorithm, developed by a subsidiary of an insurance company, effectively let whites cut in line for special programs for patients with chronic conditions such as diabetes or kidney problems, reducing the proportion of black patients receiving extra help by more than half, from almost 50 percent to less than 20 percent, and leading to stark differences in outcomes [90342]. (b) The incident also relates to the operation phase. The hospital automatically enrolled patients above certain risk scores into the program or referred them for consideration by doctors, so the algorithm's skewed scores effectively excluded black patients from the extra care program, potentially increasing the chances of emergency room visits and hospital stays for those missing out on extra care [90342].
Boundary (Internal/External) within_system The software failure incident described in the article falls under the category of within_system failure. The failure was due to contributing factors that originated from within the system itself, specifically from the biased algorithm used by the hospital to guide care for patients [90342]. The algorithm, which was designed to predict patients' future health costs as a proxy for their health needs, ended up systematically privileging white patients over black patients, leading to discriminatory effects within the system. The biased performance of the algorithm highlights how even supposedly race-neutral formulas can have discriminatory impacts when they rely on data that reflects societal inequalities.
Nature (Human/Non-human) non-human_actions, human_actions (a) The software failure incident in the article is related to non-human actions, specifically the bias in the algorithm used by a hospital to prioritize patients for special care programs. The algorithm systematically privileged white patients over black patients, leading to discriminatory effects in patient care allocation [90342]. The bias in the algorithm was not intentional but stemmed from the data it relied on, reflecting societal inequalities. The software failure was a result of the algorithm's design and the data it used, rather than any direct human actions to introduce bias. (b) However, human actions were involved in addressing the software failure incident. Researchers who discovered the bias in the algorithm worked with the company behind the software to address the issue. The company confirmed the problem and was working to rectify it [90342]. Additionally, the researchers collaborated with the algorithm's provider to test a modified version that significantly reduced the bias between white and black patients [90342]. This collaboration between researchers and the software provider highlights human actions taken to mitigate the software failure caused by non-human actions.
Dimension (Hardware/Software) software (a) The articles do not provide information about a software failure incident occurring due to contributing factors originating in hardware. (b) The software failure incident discussed in the articles is related to bias in an algorithm used in healthcare. The algorithm systematically privileged white patients over black patients, leading to discriminatory effects in patient care selection [90342].
Objective (Malicious/Non-malicious) non-malicious (a) The software failure incident described in the article is non-malicious. The failure was due to the algorithm used in a hospital's care program favoring white patients over black patients, resulting in discriminatory effects. The algorithm's bias effectively reduced the proportion of black patients receiving extra help by more than half, leading to potential negative outcomes for those excluded from the program [90342]. The incident highlights how even supposedly race-neutral algorithms can have discriminatory effects when they rely on data that reflects societal inequalities.
Intent (Poor/Accidental Decisions) poor_decisions The software failure incident is related to poor_decisions. It involved an algorithm used by a major US hospital that systematically privileged white patients over black patients, producing biased selections for special care programs [90342]. The algorithm's bias effectively reduced the proportion of black patients receiving extra help by more than half, from almost 50 percent to less than 20 percent, resulting in stark differences in outcomes for black patients compared to white patients with similar health burdens. These skewed judgments highlight the discriminatory effects that arise when algorithms lean on data reflecting inequalities in society, ultimately leading to poor decisions in patient prioritization.
Capability (Incompetence/Accidental) accidental (a) The software failure incident described in the article is not related to development incompetence. Instead, it highlights a case where the algorithm used in a hospital's care program systematically privileged white patients over black patients due to biased data and design flaws in the algorithm [90342]. (b) The software failure incident can be categorized as accidental, as the biased outcomes were not intentional but rather a result of the algorithm's reliance on data that reflected societal inequalities, leading to discriminatory effects [90342].
Duration temporary The software failure incident is temporary rather than permanent. It concerned an algorithm used by a hospital to guide patient care that systematically privileged white patients over black patients, reducing the proportion of black patients receiving extra help by more than half, from almost 50 percent to less than 20 percent. The company behind the algorithm has acknowledged the issue and is working to address it, indicating that the failure stemmed from specific design and implementation choices that are being corrected rather than from a permanent condition [90342].
Behaviour value (a) crash: The software failure incident described in the article does not involve a crash where the system loses state and does not perform any of its intended functions. The incident is related to bias in an algorithm used in healthcare decision-making [90342]. (b) omission: The software failure incident does not involve omission where the system omits to perform its intended functions at an instance(s). Instead, the incident is about the algorithm systematically privileging white patients over black patients, leading to discriminatory effects in patient care [90342]. (c) timing: The software failure incident is not related to timing issues where the system performs its intended functions correctly but too late or too early. The focus of the incident is on the biased behavior of the algorithm in patient prioritization [90342]. (d) value: The software failure incident is related to the system performing its intended functions incorrectly. The algorithm used in healthcare decision-making was found to effectively let whites cut in line for special programs for patients with complex, chronic conditions, disadvantaging black patients with similar health burdens [90342]. (e) byzantine: The software failure incident does not exhibit byzantine behavior where the system behaves erroneously with inconsistent responses and interactions. The issue in this case is the biased outcomes produced by the algorithm in patient care prioritization [90342]. (f) other: The behavior of the software failure incident can be categorized as a failure due to systemic bias in the algorithm, leading to discriminatory effects in patient care prioritization. The incident highlights the challenges of using algorithms in healthcare decision-making without considering the potential biases that can arise from the data used to train these systems [90342].

IoT System Layer

Layer Option Rationale
Perception None None
Communication None None
Application None None

Other Details

Category Option Rationale
Consequence harm, theoretical_consequence The consequence of the software failure incident described in the articles is related to potential harm and theoretical consequences: - Harm: The software failure incident resulted in the potential harm of patients, particularly black patients, as they were disadvantaged in receiving extra care due to the biased algorithm. This could lead to increased chances of emergency room visits and hospital stays for those who were excluded from the program [90342]. - Theoretical Consequence: There were discussions about potential consequences of the software failure incident, such as the replication of inequalities in access to healthcare, the impact on patient prioritization, and the need for policies to address deeper causes of health inequalities [90342].
Domain health (a) The failed system in the article was related to the healthcare industry, specifically in the context of patient care prioritization based on an algorithm used by a major US hospital [90342]. (j) The software failure incident was directly related to the health industry, as it involved an algorithm used by a hospital to guide care for patients with complex health needs [90342].
