Recurring |
one_organization, multiple_organization |
(a) The software failure incident has happened again at one_organization:
The UK government's online passport photo checker, developed by its software vendor, has faced recurring problems with racial bias and inaccurate detection of facial features. Although an updated version has been available for more than a year, HM Passport Office has not deployed the improved software [111916]. This indicates a recurring issue within the same organization, where the software has not been updated or fixed to address the known racial bias problems.
(b) The software failure incident has happened again at multiple_organization:
The racially biased online passport photo checker is not an issue unique to the UK government. Similar problems have been reported with facial recognition software used for passport checks, with studies showing biases against individuals with darker skin tones, and such software has been criticized as 'systemically racist' for making mistakes based on racial features [111916]. This suggests that racial bias in facial recognition technology is not limited to one organization but is a broader issue affecting multiple organizations that use similar software systems. |
Phase (Design/Operation) |
design, operation |
(a) The software failure incident in the UK government's online passport photo checker can be attributed to the design phase. The facial detection system, developed by the software vendor, struggled with very light or very dark skin tones, leading to 'racist' mistakes in judging photo quality [111916]. Officials knew about this difficulty but decided the system worked well enough before it went live in 2016, and although an updated version has been available for more than a year, the improved software has not been deployed, indicating a failure in the design and development process [111916].
(b) The software failure incident can also be linked to the operation phase. Users of color encountered the system misidentifying features based on their skin color, such as mistaking lips for an open mouth or failing to find the outline of the head [111916]. The failure was thus exacerbated during operation, when users ran into errors and biases while uploading photos. Additionally, the system allowed users to override the automated check outcome and submit the photo anyway (a flow sketched below), suggesting operational challenges in ensuring accurate photo verification [111916]. |
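A minimal sketch of how such a check-and-override flow might look; the function names, messages, and logic are assumptions for illustration, not the Passport Office's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class CheckResult:
    passed: bool
    reasons: list[str] = field(default_factory=list)

def automated_photo_check(photo: bytes) -> CheckResult:
    # Stand-in for the vendor's proprietary facial-detection model.
    # The reason string mirrors a message quoted in the article.
    return CheckResult(passed=False,
                       reasons=["it looks like your mouth is open"])

def submit_photo(photo: bytes, user_override: bool = False) -> bool:
    """Accept the photo if the automated check passes, or if the
    applicant overrides a failed check and submits anyway."""
    result = automated_photo_check(photo)
    if result.passed:
        return True
    if user_override:
        # The applicant disputes the automated outcome; in practice
        # the photo would then go to a manual reviewer.
        return True
    return False

# An applicant whose photo is wrongly flagged can still proceed:
assert submit_photo(b"<jpeg bytes>", user_override=True)
```

The operational weakness this illustrates is that the automated outcome is advisory: a user who is wrongly rejected bears the burden of overriding and explaining the system's mistake.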
Boundary (Internal/External) |
within_system |
(a) The software failure incident related to the UK government's online passport photo checker can be categorized as within_system. The failure was primarily due to issues within the system itself, specifically the facial detection technology used in the online passport application service. The system's algorithm had difficulty with very light or very dark skin tones, producing errors such as misreading lips as an open mouth or eyes as closed, particularly for individuals of color [111916]. The errors were attributed to a lack of diversity in the developer's workplace and to an unrepresentative sample of black people used during development, highlighting internal problems with the software's design and testing processes [111916]. |
Nature (Human/Non-human) |
non-human_actions, human_actions |
(a) The software failure incident in the UK government's online passport photo checker was primarily due to non-human actions. The facial detection system developed by the software vendor had inherent biases and difficulties in recognizing features based on skin tones, leading to 'racist' mistakes in the photo checking process [111916].
(b) However, human actions also played a role in the failure. Officials decided to deploy the technology despite knowing about its issues with very light or very dark skin tones. Additionally, a lack of diversity in the vendor's workplace and an unrepresentative sample of black people in the development process contributed to the error [111916]. |
Dimension (Hardware/Software) |
software |
(a) The software failure incident in the UK government's online passport photo checker was due to contributing factors originating in software. The facial detection system developed by the software vendor had difficulty with very light or very dark skin tones, leading to racist mistakes in the photo checking process [111916].
(b) No hardware-related contributing factors were reported. The facial recognition software used for passport checks was found to be 'systemically racist', being twice as likely to reject a picture of a black woman than of a white man; this bias originated in the software itself rather than in any hardware [111916]. |
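To make the 'twice as likely' finding concrete, the disparity can be expressed as a ratio of per-group rejection rates. A small sketch, where the counts are invented purely to illustrate the arithmetic and are not the study's data:

```python
def rejection_rate(rejected: int, total: int) -> float:
    return rejected / total

# Hypothetical audit counts, chosen only to show the calculation.
black_women = rejection_rate(rejected=44, total=200)  # 0.22
white_men = rejection_rate(rejected=22, total=200)    # 0.11

# Disparity ratio: how many times more often one group is rejected.
disparity = black_women / white_men
print(f"disparity ratio: {disparity:.1f}x")  # -> 2.0x, i.e. twice as likely
```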
Objective (Malicious/Non-malicious) |
non-malicious |
(a) The software failure incident related to the UK government's online passport photo checker can be categorized as non-malicious. The failure stemmed not from any deliberate attempt to cause harm but from unintentional biases inherent in the facial detection system used for passport photo verification: the system was found to be 'systemically racist' and had difficulty with very light or very dark skin tones, leading to discriminatory outcomes for users of color [111916]. |
Intent (Poor/Accidental Decisions) |
poor_decisions, accidental_decisions |
(a) The software failure incident related to the UK government's online passport photo checker can be attributed to poor decisions made by officials. Despite knowing that the facial detection system had difficulty with very light or very dark skin tones, officials decided it worked well enough before it went live in 2016, and the Home Office later acknowledged these issues while still judging the system's overall performance sufficient to deploy [111916]. The system was also found to be 'systemically racist', with dark-skinned individuals disproportionately affected by errors in the photo checking process [111916].
(b) The software failure incident can also be linked to accidental decisions or unintended consequences. Users of color received messages such as 'it looks like your mouth is open' or 'we can't find the outline of your head' because of their skin color, indicating unintended biases in the system's algorithms [111916]. The incident in which a black man's lips were mistaken for an open mouth by the automated photo checker highlights how these unintended biases led to errors [111916]. Additionally, the Race Equality Foundation criticized the lack of proper testing to ensure the system would work for black or ethnic minority people (a form of pre-deployment check sketched below), suggesting that the issues were not intentionally introduced but rather overlooked during development [111916]. |
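The kind of pre-deployment testing the Race Equality Foundation called for could take the form of a stratified fairness gate in release testing. A minimal sketch, in which the group names, pass rates, and the 1.25 disparity bound are all assumptions for illustration:

```python
def passes_fairness_gate(pass_rates: dict[str, float],
                         max_disparity: float = 1.25) -> bool:
    """Fail the release if the best-performing group's pass rate
    exceeds the worst-performing group's by more than the bound."""
    best = max(pass_rates.values())
    worst = min(pass_rates.values())
    return best / worst <= max_disparity

# Hypothetical per-group pass rates from a demographically
# stratified test set (not real measurements).
rates = {"light skin": 0.96, "medium skin": 0.93, "dark skin": 0.71}
assert not passes_fairness_gate(rates)  # this gate would block release
```

A gate like this would have surfaced the skin-tone disparity before the 2016 launch rather than leaving applicants to discover it.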
Capability (Incompetence/Accidental) |
development_incompetence, accidental |
(a) The software failure incident in the UK government's online passport photo checker was primarily due to development incompetence. The facial detection system developed by the software vendor had issues with racial bias, particularly affecting users of color. The system made mistakes such as wrongly identifying open mouths based on the size of lips and being unable to find the outline of the head due to skin color. Despite knowing about the difficulties with very light or very dark skin tones, officials decided the system worked well enough before it went live in 2016. The system was also found to be 'systemically racist' in rejecting photos of black individuals more frequently than white individuals [111916].
(b) The software failure incident can also be attributed to accidental factors. Users such as Elaine Owusu and Joshua Bada encountered errors in which the photo checker mistook their features for an open mouth or closed eyes, and they had to explain the system's mistake in order to proceed; this indicates the errors were not intentional but accidental, arising from the system's inability to accurately process diverse facial features. The system's failure to properly detect features like lips and eyes was not deliberate but a result of its limitations [111916]. |
Duration |
permanent, temporary |
The software failure incident related to the UK government's online passport photo checker can be categorized as both permanent and temporary.
(a) Permanent: The software failure incident can be considered permanent because the facial detection system has produced 'racist' mistakes and struggled with very light or very dark skin tones since it went live in 2016 [111916]. Despite an updated version being available for more than a year, the improved software has still not been deployed, indicating a long-standing issue with the system.
(b) Temporary: On the other hand, the delay in rolling out the updated software, which was likely affected by the Covid-19 pandemic, suggests a temporary aspect to the failure incident [111916]. Specific circumstances such as the pandemic appear to have prolonged the failure, making part of the setback temporary rather than inherent to the system. |
Behaviour |
omission, value, other |
(a) crash: The software failure incident in the articles does not involve a crash where the system loses state and does not perform any of its intended functions [111916].
(b) omission: The software failure incident involves omission, where the system fails to perform its intended function on some inputs. For example, users of color received messages such as 'it looks like your mouth is open' or 'we can't find the outline of your head' because of their skin color, indicating the system's failure to accurately detect their facial features [111916].
(c) timing: The software failure incident does not involve timing issues where the system performs its intended functions correctly but too late or too early [111916].
(d) value: The software failure incident involves a failure related to the system performing its intended functions incorrectly. For instance, the facial recognition software used to check passports was found to be 'systemically racist,' with a study showing it was more likely to reject photos of individuals with darker skin tones [111916].
(e) byzantine: The software failure incident does not exhibit a byzantine behavior where the system behaves erroneously with inconsistent responses and interactions [111916].
(f) other: The software failure incident also involves a behavior in which officials, despite being aware of the system's difficulties with very light or very dark skin tones, deployed the technology anyway, leading to problems for users of color. This could be categorized as a decision-making failure, or a failure to address known biases [111916]. |