Incident: Unreliable Algorithms Delay Welfare Payments in UK Councils

Published Date: 2019-10-15

Postmortem Analysis
Timeline 1. North Tyneside council dropped TransUnion's system for checking benefit claims in the month before publication, i.e. September 2019 [91514]. 2. Hackney council dropped Xantura from a project to predict child abuse the previous year, i.e. 2018 [91514]. 3. Sunderland city council decided not to renew a data analytics contract with Palantir for an "intelligence hub" that had been commissioned as part of a 2014 plan [91514].
System 1. TransUnion's system for checking housing and council tax benefit claims [91514] 2. Xantura's predictive model for predicting child abuse [91514] 3. Palantir's "intelligence hub" for data analysis [91514]
Responsible Organization 1. TransUnion, whose system erroneously delayed welfare payments by wrongly identifying low-risk claims as high risk [91514]. 2. Xantura, whose predictive model failed to provide sufficiently useful insights owing to variable data quality [91514]. 3. Palantir, whose "intelligence hub" did not deliver the expected benefits to Sunderland council [91514].
Impacted Organization 1. North Tyneside council - experienced delays in welfare payments after TransUnion's system erroneously identified low-risk claims as high risk [91514]. 2. Hackney council - dropped Xantura from its child abuse prediction project after the model failed to deliver the expected benefits [91514]. 3. Sunderland city council - did not renew a £4.5m data analytics contract with Palantir for an "intelligence hub" that had been justified by projected efficiency savings [91514].
Software Causes 1. The software failure incident was caused by unreliable computer algorithms used by councils to make decisions about benefit claims and welfare issues [91514].
Non-software Causes 1. Lack of understanding and oversight by the leadership of local authorities regarding the complexity of the systems being used [91514]. 2. Issues with data quality affecting the system's ability to provide useful insights [91514]. 3. Inadequate access to regular updates of source data for driving the predictive models [91514].
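The third cause above, inadequate access to regular updates of source data, is the kind of issue a routine freshness check can surface before stale data drives a predictive model. A minimal sketch, assuming hypothetical dataset names and an illustrative 30-day limit (none of this reflects any council's actual pipeline):

```python
# Hypothetical sketch: flagging stale source data before it drives a
# predictive model. Dataset names and the 30-day limit are illustrative.
from datetime import date, timedelta

MAX_AGE = timedelta(days=30)

def stale_sources(last_updated, today):
    """Return names of datasets whose last refresh exceeds MAX_AGE."""
    return [name for name, updated in last_updated.items()
            if today - updated > MAX_AGE]

sources = {
    "housing_benefit_claims": date(2019, 9, 20),
    "council_tax_records": date(2019, 6, 1),
}
print(stale_sources(sources, today=date(2019, 10, 15)))  # ['council_tax_records']
```

A check like this would at least make the "variable data quality" problem visible before the model's outputs reach caseworkers.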
Impacts 1. Welfare payments to an unknown number of people were wrongly delayed when TransUnion's system erroneously identified low-risk claims as high risk, leading to benefit claims being delayed [91514]. 2. Hackney council in east London dropped Xantura from a project to predict child abuse as the system did not deliver the expected benefits, impacting the council's ability to intervene before incidents occur [91514]. 3. Sunderland city council did not renew a £4.5m data analytics contract with Palantir for an "intelligence hub," affecting the council's ability to analyze data for various programs and potentially impacting efficiency savings [91514].
Preventions 1. Implementing thorough testing and validation procedures before deploying the software systems could have prevented the software failure incident [91514]. 2. Providing adequate training and understanding to council officials on how the systems work could have helped in identifying and addressing issues early on [91514]. 3. Ensuring transparency and accountability in the decision-making process involving automated systems could have prevented erroneous decisions and delays in benefit claims [91514]. 4. Regularly updating and maintaining the source data used by the predictive models to ensure the accuracy and reliability of the analytics could have mitigated risks associated with outdated information [91514].
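The first prevention, thorough testing and validation before deployment, could include measuring how often a risk model flags claims that audits later show to be low risk. A minimal sketch of such a pre-deployment gate, with illustrative data and an assumed acceptance threshold (nothing here reflects TransUnion's actual system):

```python
# Hypothetical sketch: a pre-deployment validation gate for a claim risk
# model, checking its false-positive rate against audited historical
# outcomes. Data and threshold are illustrative only.

def false_positive_rate(flags, audited_high_risk):
    """Share of claims flagged high risk that audits found low risk."""
    flagged = [i for i, f in enumerate(flags) if f]
    if not flagged:
        return 0.0
    wrong = sum(1 for i in flagged if not audited_high_risk[i])
    return wrong / len(flagged)

# Example: the model flags four claims; audits confirm only one as high risk.
flags =             [True, True, False, True, True, False]
audited_high_risk = [True, False, False, False, False, False]

fpr = false_positive_rate(flags, audited_high_risk)
MAX_ACCEPTABLE_FPR = 0.25  # illustrative acceptance threshold
if fpr > MAX_ACCEPTABLE_FPR:
    print(f"Do not deploy: {fpr:.0%} of high-risk flags were wrong")
```

Had a gate of this kind been applied to audited historical claims, the pattern the council report later found (most "high risk" cases were actually lower risk) would have blocked deployment.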
Fixes 1. Ensuring transparency and explainability of the algorithms used by local authorities to make decisions about benefit claims and welfare issues [91514]. 2. Regularly updating source data to drive the predictive models accurately [91514]. 3. Conducting thorough reviews and evaluations of the software systems being used by councils to identify and address any issues related to data quality, reliability, and effectiveness [91514]. 4. Balancing the benefits of using algorithms with potential risks and harms, such as privacy and data security concerns, stigmatization of communities, and unwanted intrusion by services [91514]. 5. Demanding trustworthy and transparent explanations of how the systems work, why they come to specific conclusions about individuals, fairness, and practical effectiveness [91514].
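The first and fifth fixes both call for explainable decisions; the council report noted the system "provided no reason" for cases meeting a high-risk category. One way to guarantee a stated reason for every flag is to record which rules fired. A minimal sketch with hypothetical rules and field names, not any vendor's model:

```python
# Hypothetical sketch: a risk check that records *why* each claim is
# flagged, so no case can meet a "high risk" category with no stated
# reason. Rules and field names are illustrative, not any vendor's model.

def assess_claim(claim):
    reasons = []
    if claim.get("address_mismatch"):
        reasons.append("declared address does not match records")
    if claim.get("income_undeclared"):
        reasons.append("income source missing from declaration")
    risk = "high" if reasons else "low"
    return {"risk": risk, "reasons": reasons}

result = assess_claim({"address_mismatch": True, "income_undeclared": False})
print(result["risk"], "-", "; ".join(result["reasons"]))
```

Attaching reasons to every decision also gives claimants something concrete to challenge, addressing the concern that citizens cannot contest automated decisions.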
References 1. The Guardian [91514]

Software Taxonomy of Faults

Category Option Rationale
Recurring one_organization, multiple_organization (a) The software failure incident at individual organizations: North Tyneside council dropped TransUnion's system used for checking housing and council tax benefit claims after its erroneous predictive analytics delayed welfare payments [91514], and Hackney council in east London dropped Xantura's system from a project to predict child abuse, stating that it did not deliver the expected benefits [91514]. (b) The software failure incident having happened at multiple organizations: similar failures occurred across several councils, including Sunderland city council, which did not renew a £4.5m data analytics contract for an "intelligence hub" provided by Palantir [91514]. Concerns have also been raised about the reliability and effectiveness of systems supplied by companies such as Experian, TransUnion, Capita, and Palantir to local authorities for automated decision-making on benefit claims, child abuse prediction, and other welfare issues [91514].
Phase (Design/Operation) design, operation (a) The software failure incident related to the design phase can be seen in the article where North Tyneside council dropped TransUnion's system used for housing and council tax benefit claims. The system's "predictive analytics" erroneously identified low-risk claims as high risk, leading to welfare payments being wrongly delayed. The council report highlighted that the system provided no reason for cases meeting a high-risk category, and the reasons for being high risk could not be established, causing delays in benefit claims [91514]. (b) The software failure incident related to the operation phase is evident in the case of Hackney council dropping Xantura from a project to predict child abuse. The council stated that the system did not deliver the expected benefits due to issues of variable data quality, indicating a failure in the operation or utilization of the system for providing useful insights [91514].
Boundary (Internal/External) within_system (a) The software failure incident reported in the articles can be categorized as within_system. The incidents mentioned, such as the erroneous identification of low-risk claims as high risk by TransUnion's system leading to delayed welfare payments [91514], the inability of Xantura's system to provide sufficiently useful insights due to variable data quality [91514], and the lack of reliable predictive models due to issues with accessing regular updates of source data [91514], all point to failures originating from within the system itself. These issues highlight the challenges and limitations of the algorithms and machine-learning systems being used by councils for decision-making processes.
Nature (Human/Non-human) non-human_actions, human_actions (a) The software failure incident occurring due to non-human actions: The article reports incidents where software systems provided erroneous results without human involvement. For example, in the case of North Tyneside council using TransUnion's system, the "predictive analytics" erroneously identified low-risk claims as high risk, leading to wrongly delayed welfare payments [91514]. This indicates a failure caused by non-human actions, specifically the inaccurate predictions made by the software system. (b) The software failure incident occurring due to human actions: The article also highlights instances where human actions contributed to software failures. For instance, Hackney council in east London dropped Xantura from a project to predict child abuse due to the system not delivering the expected benefits [91514]. This decision to discontinue the project was a result of human actions taken based on the system's performance, indicating a failure influenced by human decisions.
Dimension (Hardware/Software) software (a) The software failure incident related to hardware: The article does not attribute any of the failures to hardware; the delays and shortcomings stemmed from the behavior of the software systems themselves [91514]. (b) The software failure incident related to software: North Tyneside council's welfare payments were wrongly delayed because TransUnion's "predictive analytics" erroneously identified low-risk claims as high risk while automatically processing claimants' housing and council tax benefit data to estimate the likelihood of fraud [91514]. Similarly, Hackney council in east London dropped Xantura from a project to predict child abuse after the system failed to deliver the expected benefits, a failure attributed to the software provided by Xantura [91514].
Objective (Malicious/Non-malicious) non-malicious (a) The software failure incident described in the articles does not seem to be malicious, as there is no indication that the failure was due to contributing factors introduced by humans with the intent to harm the system. The incidents mentioned, such as delays in benefit claims and lack of expected benefits in predicting child abuse, appear to be a result of non-malicious factors like unreliable algorithms, lack of understanding of the systems, and issues with data quality [91514].
Intent (Poor/Accidental Decisions) poor_decisions, accidental_decisions (a) The intent of the software failure incident related to poor decisions can be seen in the article where it mentions that some councils invested in software contracts without fully understanding how the systems work. Gwilym Morris, a management consultant, highlighted that the complexity of the systems meant that the leadership of local authorities "don’t really understand what is going on" [91514]. This lack of understanding raises questions about how citizens' data was used and indicates poor decision-making in implementing these software systems. (b) The intent of the software failure incident related to accidental decisions is evident in the case of North Tyneside council using TransUnion's system for benefit claims. The system erroneously identified low-risk claims as high risk, leading to welfare payments being wrongly delayed. The council report concluded that most cases deemed high risk by the software were actually lower risk, and there was no reason for the payment to be withheld, but claims were delayed due to the system's errors [91514]. This incident reflects a failure caused by unintended decisions or mistakes within the software system.
Capability (Incompetence/Accidental) development_incompetence (a) The software failure incident related to development incompetence is evident in the case of North Tyneside council using TransUnion's system for benefit claims. The system erroneously identified low-risk claims as high risk, leading to delays in welfare payments to individuals. The council report highlighted that the software provided no reason for cases meeting a high-risk category, and the reasons for being classified as high risk could not be established, causing delays in benefit claims [91514]. (b) The software failure incident related to accidental factors is seen in the case of Hackney council dropping Xantura from a project to predict child abuse. The council mentioned that the system did not deliver the expected benefits, indicating that the failure was not intentional but rather a result of the system's inability to provide useful insights due to issues of variable data quality [91514].
Duration temporary The software failure incidents were temporary rather than permanent: each was tied to a specific deployed system and ended when the council concerned discontinued it. North Tyneside council dropped TransUnion's system after its inaccurate predictive analytics delayed benefit payments, Hackney council dropped Xantura after its model failed to provide useful insights for child abuse prediction, and Sunderland city council did not renew its Palantir contract [91514].
Behaviour omission, value, other (a) crash: A crash, in the sense of the system losing state and ceasing to perform its intended functions, is not reported in the article. (b) omission: The omission behavior is demonstrated by TransUnion's system wrongly withholding benefit payments that should have been made, and by Xantura's system failing to deliver the expected insights for predicting child abuse [91514]. (c) timing: The software failure incident related to the timing behavior is not explicitly mentioned in the provided article. (d) value: The value behavior is illustrated by TransUnion's system incorrectly classifying cases as high risk when they were actually lower risk, producing incorrect outputs that wrongly delayed benefit claims [91514]. (e) byzantine: The software failure incident related to the byzantine behavior is not explicitly mentioned in the provided article. (f) other: Other behaviors include concerns raised about privacy and data security, the difficulty for citizens in challenging automated decisions, and the lack of understanding by council officials of how some of the systems work, raising questions about how citizens' data is used [91514].

IoT System Layer

Layer Option Rationale
Perception None None
Communication None None
Application None None

Other Details

Category Option Rationale
Consequence basic, property, delay, other (a) death: There is no mention of any deaths resulting from the software failure incident reported in the articles [91514]. (b) harm: The software failure incident did not result in physical harm to individuals [91514]. (c) basic: The software failure incident did impact people's access to welfare benefits, as some benefit claims were wrongly delayed due to erroneous identification by the computer system [91514]. (d) property: People's welfare payments were impacted as a result of the software failure incident, with benefit claims being wrongly delayed [91514]. (e) delay: The software failure incident caused delays in benefit payments to individuals due to the erroneous identification of low-risk claims as high risk by the computer system [91514]. (f) non-human: The software failure incident did not mention any impact on non-human entities [91514]. (g) no_consequence: The software failure incident did have real observed consequences, such as delays in benefit payments and the need for councils to discontinue contracts with certain software providers [91514]. (h) theoretical_consequence: There were potential consequences discussed, such as concerns about privacy, data security, and the ability for citizens to challenge automated decisions, but these were not explicitly mentioned as occurring as a result of the software failure incident [91514]. (i) other: The software failure incident led to concerns about the increasing use of algorithms leaving vulnerable people at the whim of automated decisions they cannot challenge, raising questions about how citizens' data was used [91514].
Domain information, government (a) The failed systems were intended to support the government sector. The software contracts in question were used by local authorities (councils) to make decisions about benefit claims, prevent child abuse, allocate school places, and analyze data for programs such as the Troubled Families program [91514]. The article specifically cites councils such as North Tyneside, Hackney, and Sunderland using systems provided by TransUnion, Xantura, and Palantir respectively for these government-related purposes.
