Recurring |
one_organization, multiple_organization |
(a) The software failure incident having happened again at one_organization:
- The article mentions that the Home Office recently stopped using an algorithm to help decide visa applications after allegations that it contained "entrenched racism" [103566]. This indicates a software failure incident within the Home Office related to the algorithm used for visa applications.
(b) The software failure incident having happened again at multiple_organization:
- The article discusses how the use of artificial intelligence or automated decision-making has come into sharp focus after an algorithm used by the exam regulator Ofqual downgraded almost 40% of A-level grades assessed by teachers, leading to a humiliating government U-turn and the system being scrapped [103566].
- It also mentions that the Data Justice Lab found at least two other councils had stopped using a risk-based verification system due to various issues [103566].
- Additionally, the article highlights that police forces are increasingly experimenting with the use of artificial intelligence or automated decision-making, with some forces using or trialling such technologies to help identify crime hotspots [103566]. |
Phase (Design/Operation) |
design, operation |
(a) The article mentions instances where software failures occurred due to contributing factors introduced during the system development phase. For example, the article discusses how the algorithm used by the exam regulator Ofqual led to the downgrading of A-level grades assessed by teachers, resulting in a government U-turn and the system being scrapped [103566]. Additionally, it is highlighted that councils like Sunderland and Hackney abandoned algorithms designed to make efficiency savings or predict risks of neglect and abuse, respectively, due to concerns about negative effects and bias introduced during the development phase [103566].
(b) The article also covers software failures resulting from factors introduced during the operation phase. For instance, about 20 councils stopped using an algorithm that flagged claims as "high risk" for potential welfare fraud, because in operation the system slowed down claims without the claimants being aware [103566]. Furthermore, the Home Office stopped using an algorithm to help decide visa applications after allegations of "entrenched racism," again pointing to issues that surfaced while the system was in operation [103566] (see the sketch below). |
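To make the design/operation split concrete, the sketch below shows a hypothetical risk-based verification routine of the kind the article describes. Every field name, weight and threshold here is invented for illustration (the councils' actual systems are not public): the weights and the cut-off are design-time choices, while the silent routing of "high risk" claims into a slower manual queue is the effect claimants would experience during operation.

```python
# Hypothetical sketch only -- the real council systems are not public, and all
# field names, weights and thresholds below are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class Claim:
    claimant_id: str
    weekly_income: float
    months_at_address: int
    previous_claims: int


def risk_score(claim: Claim) -> float:
    # Design-phase choices: these weights are baked in at development time.
    # If they are miscalibrated, the defect is introduced during design.
    score = 0.0
    if claim.weekly_income < 100:
        score += 0.4
    if claim.months_at_address < 6:
        score += 0.3
    score += min(claim.previous_claims * 0.1, 0.3)
    return score


def route_claim(claim: Claim) -> str:
    # Operation-phase effect: anything over the threshold is silently routed
    # to a slower manual-verification queue, so the claimant never learns why
    # the claim is delayed -- the behaviour the article attributes to the
    # risk-based verification systems.
    HIGH_RISK_THRESHOLD = 0.5  # assumed value
    if risk_score(claim) >= HIGH_RISK_THRESHOLD:
        return "manual_verification_queue"  # slower, opaque to the claimant
    return "automatic_processing"


if __name__ == "__main__":
    claim = Claim("C-001", weekly_income=95.0, months_at_address=4, previous_claims=0)
    print(route_claim(claim))  # -> manual_verification_queue
```

A low-income claimant who has recently moved is flagged purely because of the fixed weights chosen at design time, which is how a design-phase defect only becomes visible once the system is operating on real claims.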
Boundary (Internal/External) |
within_system |
(a) within_system:
1. The software failure incident mentioned in the articles is related to the use of computer algorithms and automated decision-making systems within various government bodies, councils, and organizations [103566].
2. The failure was attributed to problems in the way the systems work, concerns about bias, negative effects, and a lack of transparency and consultation with the public before implementing these automated and predictive systems [103566].
3. Specific examples include the algorithm used by the exam regulator Ofqual that downgraded A-level grades, leading to a government U-turn and the system being scrapped, as well as the algorithm used by the Home Office for visa applications that was accused of containing "entrenched racism" and subsequently scrapped [103566].
4. The failure within the system also extended to police forces experimenting with artificial intelligence and automated decision-making, with concerns raised about the lack of public consultation and transparency in using these systems [103566]. |
Nature (Human/Non-human) |
non-human_actions, human_actions |
(a) The software failure incident occurring due to non-human actions:
The incident mentioned in the articles is related to the failure of computer algorithms and automated decision-making systems used by various government bodies and organizations. These failures were attributed to problems in the way the systems work, concerns about bias, negative effects, and the algorithms wrongly identifying low-risk claims as high-risk, among other issues. For example, the algorithm used by the exam regulator Ofqual led to the downgrading of A-level grades assessed by teachers, resulting in a government U-turn and the system being scrapped [103566].
(b) The software failure incident occurring due to human actions:
The articles also highlight instances where human actions contributed to software failures. For example, the Home Office stopped using an algorithm to help decide visa applications after allegations of "entrenched racism," leading to a legal challenge and the system being scrapped. Additionally, concerns were raised about the lack of consultation with the public before implementing automated decision-making systems, which could lead to discrimination and bias. The director of Foxglove emphasized the importance of democratic debate and consultation with the public before implementing such systems to prevent discrimination and bias [103566]. |
Dimension (Hardware/Software) |
software |
(a) The articles do not provide information about a software failure incident occurring due to contributing factors originating in hardware.
(b) The articles discuss software failure incidents related to the use of algorithms and automated decision-making systems in various public services and government bodies. These incidents include the failure of an algorithm used by the exam regulator Ofqual that led to the downgrading of A-level grades, prompting a government U-turn and the system being scrapped [103566]. Additionally, the Home Office stopped using an algorithm to help decide visa applications after allegations of "entrenched racism" and a legal challenge by advocacy groups [103566]. The articles highlight concerns about bias, negative effects, and lack of transparency in the implementation of these automated systems, leading to their cancellation or pause by various councils and government bodies [103566]. |
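As a rough illustration of how a failure of this kind can live entirely in software, the sketch below shows a crude grade-moderation routine that redistributes a cohort's teacher-assessed grades to match the school's historical profile. This is not the actual Ofqual model, which was considerably more complex and is only partially described in the article; it is an assumed simplification showing how such logic can downgrade individual students with no hardware fault involved.

```python
# Highly simplified, assumed sketch -- NOT the actual Ofqual model. It only
# illustrates how purely software-level moderation logic can downgrade
# teacher-assessed grades by anchoring them to a school's historical results.
from collections import Counter
from typing import List

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst


def moderate_grades(teacher_grades: List[str],
                    historical_distribution: List[str]) -> List[str]:
    """Reassign a cohort's grades so they roughly match the school's past profile."""
    cohort_size = len(teacher_grades)
    hist_counts = Counter(historical_distribution)
    hist_total = len(historical_distribution)

    # How many of each grade the historical profile "allows" for this cohort.
    quota = {g: round(hist_counts[g] / hist_total * cohort_size) for g in GRADES}

    # Rank students from strongest to weakest teacher assessment.
    ranked = sorted(range(cohort_size), key=lambda i: GRADES.index(teacher_grades[i]))

    # Hand out grades according to the quota; individual assessments are
    # overridden whenever they do not fit the historical distribution.
    grade_pool = iter(g for g in GRADES for _ in range(quota[g]))
    moderated = [""] * cohort_size
    for student in ranked:
        moderated[student] = next(grade_pool, "U")  # overflow falls to the bottom
    return moderated


if __name__ == "__main__":
    teacher = ["A", "A", "B", "B", "C"]       # teacher-assessed grades
    history = ["B", "B", "C", "C", "D"]       # school's results the previous year
    print(moderate_grades(teacher, history))  # ['B', 'B', 'C', 'C', 'D'] -- every student downgraded
```

The point of the example is that every contributing factor — the quota calculation, the ranking, the override of individual assessments — is a software design choice, which matches the article's framing of the incident as a software failure rather than a hardware one.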
Objective (Malicious/Non-malicious) |
malicious, non-malicious |
(a) The case for reading the incident as malicious rests on the visa-application system: the Home Office stopped using an algorithm to help decide visa applications after allegations that it contained "entrenched racism," i.e. that discriminatory treatment of certain applicants was embedded in the system rather than arising purely by accident [103566]. The charity JCWI and the digital rights group Foxglove launched a legal challenge against the system, which was scrapped before the case went to court [103566].
(b) On the other hand, the software failure incident also involved non-malicious factors where problems in the way the systems work, concerns about bias, negative effects, and lack of transparency led to the cancellation of various algorithm programs in government bodies. The article highlights that the reasons for cancelling these programs ranged from problems in the system's functionality to concerns about negative effects and bias [103566]. This indicates that the failure was also influenced by contributing factors introduced without the intent to harm the system. |
Intent (Poor/Accidental Decisions) |
poor_decisions, accidental_decisions |
(a) The articles mention instances where software failure incidents were related to poor decisions made in implementing algorithms or automated decision-making systems. For example, the article discusses how the algorithm used by the exam regulator Ofqual led to the downgrading of A-level grades assessed by teachers, resulting in a humiliating government U-turn and the system being scrapped [103566]. Additionally, it is highlighted that some councils stopped using algorithms for various purposes due to concerns about negative effects, bias, and problems in the way the systems work, indicating poor decisions in implementing such systems [103566].
(b) The articles also touch upon software failure incidents resulting from accidental decisions or unintended consequences. For instance, the Home Office stopped using an algorithm to help decide visa applications after allegations of "entrenched racism," which led to a legal challenge and the system being scrapped before a case went to court [103566]. This incident suggests that unintended consequences or mistakes in the design or implementation of the algorithm contributed to the failure. |
Capability (Incompetence/Accidental) |
development_incompetence, accidental |
(a) The articles mention instances where software failure incidents occurred due to development incompetence. For example, the article discusses how the algorithm used by the exam regulator Ofqual led to the downgrading of almost 40% of A-level grades, resulting in a humiliating government U-turn and the system being scrapped [103566]. Additionally, it is highlighted that some councils stopped using algorithms for various purposes due to problems in the way the systems work, concerns about bias, and negative effects, indicating a lack of professional competence in implementing these systems [103566].
(b) The articles also touch upon software failure incidents that happened accidentally. One such example is the Home Office stopping the use of an algorithm to help decide visa applications after allegations of "entrenched racism" [103566]. This incident suggests that the failure was not intentional but rather a result of unintended consequences within the algorithm. |
Duration |
permanent, temporary |
The software failure incidents mentioned in the articles can be categorized as both temporary and permanent:
Temporary:
1. The article discusses instances where councils and government bodies stopped using algorithms for purposes such as identifying welfare fraud, making efficiency savings, and predicting risks of child neglect and abuse [103566]. These are temporary failures in the sense that the algorithms were in use for a period and were then halted or paused because of problems in the way they worked, concerns about bias or negative effects, or a lack of consultation with the public.
Permanent:
1. The article highlights the case of the exam regulator Ofqual where an algorithm used to grade A-level exams resulted in almost 40% of grades being downgraded, leading to a government U-turn and the system being scrapped [103566]. This incident represents a more permanent software failure as the system was completely abandoned due to its significant negative impact and public outcry.
Therefore, the software failure incidents discussed in the articles encompass both temporary failures where algorithms were stopped after implementation and permanent failures where systems were completely scrapped due to severe consequences. |
Behaviour |
crash, omission, value, other |
(a) crash: No literal program crash is reported; the closest behaviour is a system ceasing to perform its intended function altogether. The Ofqual grading algorithm ended this way: after it downgraded almost 40% of A-level grades, a government U-turn followed and the system was scrapped outright [103566].
(b) omission: The incident also involved omission: claims wrongly flagged as high-risk by the risk-based verification systems were slowed down without the claimants being aware, so the prompt processing the systems were meant to deliver was effectively withheld in those cases [103566].
(c) timing: There is no specific mention of a timing-related failure in the articles provided.
(d) value: The incident also involved a value-related failure: the algorithm the Home Office used to help decide visa applications was alleged to contain "entrenched racism," i.e. to produce biased assessments, and was scrapped before a legal challenge went to court [103566] (see the sketch after this list).
(e) byzantine: There is no specific mention of a byzantine-related failure in the articles provided.
(f) other: The other behavior observed in the software failure incident was the failure to consult with the public and those most affected by the automated decision-making systems before implementing them. This lack of consultation led to concerns about negative effects, bias, and a range of harms caused by the algorithms [103566]. |
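To make the "value" behaviour concrete, the hypothetical sketch below shows a streaming function that never crashes and always returns an answer, but whose answers are skewed because a sensitive attribute feeds directly into the score. All names and weights are invented; the internals of the Home Office tool were not published.

```python
# Hypothetical illustration of a "value" failure -- not the Home Office tool,
# whose internals were never published. All weights here are invented.
from typing import Dict

# Assumed, made-up risk weights; the point is that nationality is an input.
NATIONALITY_RISK: Dict[str, float] = {
    "country_a": 0.0,
    "country_b": 0.5,   # applicants from here start halfway to "high risk"
}


def stream_application(nationality: str, prior_refusals: int) -> str:
    """Return the processing stream ("green", "amber" or "red") for an applicant."""
    score = NATIONALITY_RISK.get(nationality, 0.2) + 0.25 * prior_refusals
    if score >= 0.5:
        return "red"     # slowest, most sceptical processing stream
    if score >= 0.25:
        return "amber"
    return "green"


if __name__ == "__main__":
    # Two applicants identical in every respect except nationality land in
    # different streams: no crash, no missing output, just a biased value.
    print(stream_application("country_a", prior_refusals=0))  # green
    print(stream_application("country_b", prior_refusals=0))  # red
```

Because the function always produces a well-formed result, the failure only shows up in the content of that result, which is the distinction the taxonomy draws between a value failure and a crash or omission.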