Recurring |
one_organization |
(a) The software failure incident related to a biased algorithm favoring white patients over black patients in healthcare has happened at one organization. The incident involved an algorithm used by a large academic hospital in the US that systematically privileged white patients over black patients when selecting candidates for special care programs [90342]. The algorithm, developed by a subsidiary of an insurance company and used in the care of 70 million patients, was confirmed to produce biased judgments, favoring white patients over black patients with similar health burdens. The company behind the algorithm has acknowledged the issue and is working to address it [90342].
(b) There is no specific information in the articles indicating that a similar software failure incident has happened at multiple organizations. The focus of the articles is the algorithm used by this particular hospital and the broader implications of biased algorithms in healthcare. |
Phase (Design/Operation) |
design, operation |
(a) The software failure incident is related to the design phase. The algorithm used by a major US hospital to guide care for patients with complex health needs systematically privileged white patients over black patients. The algorithm, developed by a subsidiary of an insurance company, effectively let whites cut in line for special programs for patients with chronic conditions such as diabetes or kidney problems. The algorithm's bias reduced the proportion of black patients receiving extra help by more than half, from almost 50 percent to less than 20 percent, leading to stark differences in outcomes [90342].
(b) The software failure incident is also related to the operation phase. In use, the algorithm automatically enrolled patients above certain risk scores into the program or referred them for consideration by doctors (see the sketch below). Because the risk scores were skewed, black patients were effectively excluded from the extra care program despite comparable health needs, leaving those who missed out on extra care at greater risk of emergency room visits and hospital stays [90342]. |
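A minimal sketch of the threshold-based triage described in (b), assuming hypothetical cutoff values and function names; the article does not disclose the algorithm's actual risk-score thresholds:

```python
# Minimal sketch of threshold-based enrollment on a predicted risk score.
# The cutoff values and names below are hypothetical, not taken from the article.

AUTO_ENROLL_CUTOFF = 97.0   # hypothetical percentile above which patients are enrolled automatically
REFERRAL_CUTOFF = 55.0      # hypothetical percentile above which patients are flagged for doctor review

def triage(risk_score: float) -> str:
    """Map a predicted risk score to a care-program action."""
    if risk_score >= AUTO_ENROLL_CUTOFF:
        return "auto_enroll"      # enrolled in the extra-care program automatically
    if risk_score >= REFERRAL_CUTOFF:
        return "refer_to_doctor"  # referred for consideration by clinicians
    return "standard_care"        # no extra help

print(triage(98.2))  # auto_enroll
print(triage(60.0))  # refer_to_doctor
```

Under such a scheme, any systematic under-scoring of one group translates directly into fewer automatic enrollments and referrals for that group.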
Boundary (Internal/External) |
within_system |
The software failure incident described in the article falls under the category of within_system failure. The contributing factors originated within the system itself, specifically in the biased algorithm the hospital used to guide care for patients [90342]. The algorithm predicted patients' future health costs as a proxy for their health needs; because black patients with similar health burdens tended to incur lower health costs, a reflection of unequal access to care, the cost proxy systematically privileged white patients over black patients. The incident shows how even a supposedly race-neutral formula can have discriminatory effects when it relies on data that reflects societal inequalities (see the sketch below). |
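A minimal simulation of how a race-neutral cost proxy can skew selection. All numbers here (the access factor, group sizes, and selection cutoff) are illustrative assumptions, not figures from the study:

```python
# Two groups with identical distributions of true health need, but one group
# incurs lower costs (an assumed stand-in for unequal access to care).
# Ranking by predicted cost then under-selects that group.
import random

random.seed(0)

def simulate(group: str, n: int = 10_000):
    patients = []
    for _ in range(n):
        need = random.gauss(5.0, 2.0)              # true health need, same for both groups
        access = 1.0 if group == "white" else 0.7  # assumed lower spending for the same need
        cost = max(need, 0) * access * 1_000       # predicted future cost: the proxy label
        patients.append((need, cost))
    return patients

white = simulate("white")
black = simulate("black")

# Select the top 10% of the combined population by predicted cost,
# as a stand-in for the program's risk-score cutoff.
combined = [(cost, "white") for _, cost in white] + \
           [(cost, "black") for _, cost in black]
combined.sort(reverse=True)
selected = combined[: len(combined) // 10]

share_black = sum(1 for _, g in selected if g == "black") / len(selected)
print(f"Black share of selected patients: {share_black:.0%}")  # far below 50% despite equal need
```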
Nature (Human/Non-human) |
non-human_actions, human_actions |
(a) The software failure incident in the article is related to non-human actions, specifically the bias in the algorithm used by a hospital to prioritize patients for special care programs. The algorithm systematically privileged white patients over black patients, leading to discriminatory effects in patient care allocation [90342]. The bias in the algorithm was not intentional but stemmed from the data it relied on, reflecting societal inequalities. The software failure was a result of the algorithm's design and the data it used, rather than any direct human actions to introduce bias.
(b) However, human actions were involved in addressing the software failure incident. Researchers who discovered the bias worked with the company behind the software, which confirmed the problem and is working to rectify it [90342]. Together they tested a modified version of the algorithm that significantly reduced the bias between white and black patients (a sketch of this kind of label change follows below) [90342]. This collaboration between the researchers and the software provider highlights human actions taken to mitigate a failure caused by non-human actions. |
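A minimal sketch of the kind of label change the researchers and the vendor tested: replacing the cost-only proxy with a blend of cost and a direct health signal. The weighting, the normalisation, and the chronic_conditions feature are assumptions for illustration, not the modified model's actual formulation:

```python
# Cost-only proxy versus a blended proxy that counts disease burden directly.
# All constants are illustrative assumptions.

def original_risk(predicted_cost: float) -> float:
    # original proxy: predicted future cost stands in for future health need
    return predicted_cost

def modified_risk(predicted_cost: float, chronic_conditions: int, weight: float = 0.5) -> float:
    # blended proxy: mix normalised cost with a direct health signal
    cost_component = predicted_cost / 10_000       # crude normalisation, illustrative only
    health_component = chronic_conditions / 10
    return weight * cost_component + (1 - weight) * health_component

# Two patients with the same disease burden (4 chronic conditions) but different spending:
print(original_risk(8_000), original_risk(4_000))        # cost-only proxy ranks them far apart
print(modified_risk(8_000, 4), modified_risk(4_000, 4))  # blended proxy narrows the gap,
                                                         # because shared disease burden now counts directly
```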
Dimension (Hardware/Software) |
software |
(a) The articles do not provide information about a software failure incident occurring due to contributing factors originating in hardware.
(b) The software failure incident discussed in the articles is related to bias in an algorithm used in healthcare. The algorithm systematically privileged white patients over black patients, leading to discriminatory effects in patient care selection [90342]. |
Objective (Malicious/Non-malicious) |
non-malicious |
(a) The software failure incident described in the article is non-malicious. The failure was due to the algorithm used in a hospital's care program favoring white patients over black patients, resulting in discriminatory effects. The algorithm's bias effectively reduced the proportion of black patients receiving extra help by more than half, leading to potential negative outcomes for those excluded from the program [90342]. The incident highlights how even supposedly race-neutral algorithms can have discriminatory effects when they rely on data that reflects societal inequalities. |
Intent (Poor/Accidental Decisions) |
poor_decisions |
The software failure incident described in the article is related to poor_decisions. The poor decision was the design choice to predict patients' future health costs as a proxy for their health needs, a choice that caused the algorithm used by a major US hospital to systematically privilege white patients over black patients when selecting patients for special care programs [90342]. The resulting bias reduced the proportion of black patients receiving extra help by more than half, from almost 50 percent to less than 20 percent, producing stark differences in outcomes for black patients compared with white patients with similar health burdens. The skewed judgments illustrate the discriminatory effects that can arise when an algorithm leans on data reflecting inequalities in society. |
Capability (Incompetence/Accidental) |
accidental |
(a) The software failure incident described in the article is not related to development incompetence. Instead, it highlights a case where the algorithm used in a hospital's care program systematically privileged white patients over black patients due to biased data and design flaws in the algorithm [90342].
(b) The software failure incident can be categorized as accidental, as the biased outcomes were not intentional but rather a result of the algorithm's reliance on data that reflected societal inequalities, leading to discriminatory effects [90342]. |
Duration |
temporary |
The software failure incident described in the article is temporary rather than permanent. The incident involved an algorithm used by a hospital to guide care for patients, which systematically privileged white patients over black patients, reducing the proportion of black patients receiving extra help by more than half, from almost 50 percent to less than 20 percent. The company behind the algorithm has acknowledged the issue and is working to address it, and a modified version tested by the researchers significantly reduced the bias, indicating that the failure stemmed from specific design and implementation choices that can be corrected rather than from a permanent limitation [90342]. |
Behaviour |
value |
(a) crash: The software failure incident described in the article does not involve a crash where the system loses state and does not perform any of its intended functions. The incident is related to bias in an algorithm used in healthcare decision-making [90342].
(b) omission: The software failure incident does not involve omission where the system omits to perform its intended functions at an instance(s). Instead, the incident is about the algorithm systematically privileging white patients over black patients, leading to discriminatory effects in patient care [90342].
(c) timing: The software failure incident is not related to timing issues where the system performs its intended functions correctly but too late or too early. The focus of the incident is on the biased behavior of the algorithm in patient prioritization [90342].
(d) value: The software failure incident is related to the system performing its intended functions incorrectly. The algorithm used in healthcare decision-making was found to effectively let whites cut in line for special programs for patients with complex, chronic conditions, disadvantaging black patients with similar health burdens [90342].
(e) byzantine: The software failure incident does not exhibit byzantine behavior where the system behaves erroneously with inconsistent responses and interactions. The issue in this case is the biased outcomes produced by the algorithm in patient care prioritization [90342].
(f) other: Overall, the incident is best characterized as the value failure described in (d): systemic bias in the algorithm produced incorrect, discriminatory prioritization of patients for extra care. It highlights the challenges of using algorithms in healthcare decision-making without accounting for the biases that can arise from the data used to build and calibrate these systems [90342]. |