Recurring |
one_organization, multiple_organization |
(a) The software failure incident involving the misuse of facial recognition technology has happened again within the same organization, the Metropolitan police in the UK. The incident involved facial recognition software that misidentified individuals, leading to wrongful stops and potential violations of human rights law [87963].
(b) Similar incidents involving facial recognition technology have also occurred at other organizations. The article notes that comparable facial scanning software is being used in shopping centers and by other police forces in Manchester, Leicester, and South Wales to identify individuals, including "persons of interest" and wanted criminals, raising concerns about privacy and potential bias in the technology [87963]. |
Phase (Design/Operation) |
design, operation |
(a) Design-phase contributing factors are evident in the facial recognition system used by the police. The article reports that the system regularly misidentified people, leading to wrongful stops, and warns of "surveillance creep", with the technology used to find individuals not wanted by the courts [87963]. The researchers also found that watchlists were sometimes out of date and included individuals considered "at risk or vulnerable" alongside those wanted by the courts, and that police were too hasty to stop people before matches could be properly checked, leading to mistakes [87963].
(b) Operation-phase factors are highlighted by the trial results: of the 42 people flagged by the system, only 8 were actually being sought by the police, a match rate of roughly one in five. Some individuals were stopped over crimes that had already been dealt with by the courts and were then arrested for more minor offenses that would not normally be considered serious enough to be tackled using facial recognition [87963]. This indicates a failure in how the system was operated, leading to unnecessary stops and potential wrongful arrests. |
Boundary (Internal/External) |
within_system |
(a) within_system: The failure was primarily due to contributing factors originating within the system itself. The article notes that the system regularly misidentified people, leading to wrongful stops, and that the watchlists used were sometimes out of date and included individuals considered "at risk or vulnerable" as well as those wanted by the courts [87963]. The research also highlighted that police were too hasty to stop people before matches could be properly checked, leading to mistakes, and that officers viewed the technology as a way of detecting and deterring crime that could have been achieved without biometric technology [87963]. Inaccurate identification and outdated or flawed watchlists are failures within the system itself and contributed directly to the incident. |
Nature (Human/Non-human) |
non-human_actions |
(a) The software failure incident is primarily attributable to non-human actions. The failure stems from the facial recognition software's inability to correctly identify individuals, leading to mistaken stops and potential violations of human rights law. Misidentification of people, out-of-date watchlists, and potential bias in whom the system flags are the key contributing factors [87963]. The software's deployment across public spaces for surveillance purposes, without adequate oversight and regulation, further compounded the risks arising from these non-human actions.
Dimension (Hardware/Software) |
software |
(a) The software failure incident related to hardware:
- The article does not mention any hardware-related contributing factors. It focuses on the facial recognition software misidentifying individuals and on concerns about human rights violations and potential bias in the technology [87963].
(b) The software failure incident related to software:
- The incident is primarily attributed to the facial recognition software itself: it regularly misidentified individuals, relied on out-of-date watchlists, and was used in ways that could violate human rights law, leading to wrongful stops and potential miscarriages of justice. The article highlights concerns about the software's accuracy, potential bias, and the lack of transparency surrounding its use [87963]. |
Objective (Malicious/Non-malicious) |
non-malicious |
(a) The incident can be categorized as non-malicious. It involved failures in the accuracy and effectiveness of the facial recognition system, leading to misidentifications, wrongful stops, and potential bias in who was flagged. The failure was attributed to the technology being prone to false positives and false negatives, especially with noisy imagery, and to its tendency to wrongly identify dark-skinned women more often than light-skinned men [87963]. The concerns raised by academics and civil rights groups focused on the harm caused by the technology's inaccuracy and intrusiveness rather than on any malicious intent behind the failures.
Intent (Poor/Accidental Decisions) |
poor_decisions |
(a) The intent of the software failure incident:
The failure was primarily due to poor decisions made in implementing the technology. The article highlights that the system regularly misidentified people, leading to wrongful stops, and warns of "surveillance creep", with the technology used to find individuals not wanted by the courts. Concerns were also raised about potential bias, particularly the software wrongly identifying dark-skinned women. Research by academics from the University of Essex found that police were too hasty to stop people before matches could be properly checked, leading to mistakes, and that watchlists were sometimes out of date and included individuals considered "at risk or vulnerable" alongside those wanted by the courts [87963]. These issues point to failures resulting from poor decisions in the implementation and use of the facial recognition software. |
Capability (Incompetence/Accidental) |
development_incompetence, accidental |
(a) Development incompetence is evident in the facial recognition software used by the police. An independent analysis by academics from the University of Essex found that the system had a low accuracy rate, with matches correct in only a fifth of cases, and that it regularly misidentified people, leading to wrongful stops and potential miscarriages of justice [87963].
(b) Accidental factors are demonstrated by the mistakes police made when stopping individuals on the basis of faulty identifications by the facial recognition software. Officers acted hastily before matches could be properly checked, and some individuals were stopped over crimes that had already been dealt with by the courts or for minor offenses not serious enough to warrant such action. This indicates accidental failures in the implementation and use of the technology [87963]. |
Duration |
temporary |
The incident can be categorized as a temporary failure. The problems identified (misidentified individuals, wrongful stops, out-of-date watchlists, potential bias in identification, and hasty actions leading to mistakes) were specific to the circumstances in which the police deployed and operated the facial recognition software [87963]. The failure was not inherent to the software under all circumstances but arose from its implementation and use in real-world scenarios.
Behaviour |
omission, value, other |
(a) crash: The incident did not involve a crash in which the system lost state and performed none of its intended functions. The system remained operational and actively scanned faces against a watchlist, albeit with significant accuracy problems and potential human rights violations [87963].
(b) omission: The incident involved omission in that the system failed to perform its intended function of accurately matching faces against the watchlist in individual instances, producing misidentifications that led to wrongful stops and potential miscarriages of justice [87963].
(c) timing: The incident was not primarily a timing failure in which the system performed its intended functions correctly but too late or too early. The concerns centered instead on the accuracy and potential misuse of the technology, including privacy, human rights violations, and wrongful arrests [87963].
(d) value: The incident falls under the value category because the system performed its intended function incorrectly: it misidentified individuals, leading to wrongful stops and arrests, and showed potential bias in whom it identified [87963].
(e) byzantine: The incident did not exhibit byzantine behavior, i.e., erroneous, inconsistent responses and interactions. The main issues concerned accuracy, privacy, potential bias, and human rights violations associated with the facial recognition technology [87963].
(f) other: The incident can also be described as a failure arising from ethical and legal concerns about the use of the facial recognition system. Its behavior raised questions about privacy, freedom of expression, potential bias, and the risk of wrongful arrests, highlighting societal implications beyond technical malfunction [87963]. |