Published Date: 2021-10-05
Postmortem Analysis | |
---|---|
Timeline | 1. April 2021: an Uber driver lost his job after the automated face-scanning software failed to recognize him [119712]. 2. April 2021: an Uber Eats driver was permanently suspended for 'sharing' his account after failing the Face API checks [129924]. 3. May 2021: an Uber Eats driver's account was deactivated, allegedly unlawfully, after the facial-verification software wrongly decided on several occasions that his selfie pictures were of someone else [119714]. |
System | 1. Facial recognition software used by Uber, specifically the Face API from Microsoft [119712, 129924, 119714] |
Responsible Organization | 1. Uber [119712, 129924, 119714] 2. Microsoft [119712, 129924, 119714] |
Impacted Organization | 1. Uber drivers [119712, 129924, 119714] 2. Uber Eats drivers [129924, 119714] |
Software Causes | 1. The automated facial-recognition check used by Uber, built on Microsoft's Face API, wrongly decided on several occasions that drivers' selfies were of someone else, leading to account deactivations [119712, 129924, 119714]. 2. The software exhibited racial bias: its error rates are higher when verifying people with darker skin tones, placing drivers from ethnic minority groups at a disadvantage [119712, 129924, 119714]. 3. These false non-matches were then treated as conclusive evidence of account sharing or fraud, producing wrongful deactivations of drivers' accounts [119712, 129924, 119714]. |
Non-software Causes | 1. Lack of meaningful human review in the facial recognition process [129924] 2. Failure to investigate the potential discriminatory effect of the automated software [119714] 3. Lack of transparency about the reasons behind deactivation decisions [129924] |
Impacts | 1. Several drivers had their accounts deactivated, allegedly unlawfully, after the facial recognition software mistook their identity, causing loss of income and financial hardship [119712, 129924, 119714]. 2. Affected drivers were locked out of the system, undermining their ability to work and provide for their families [119712, 129924]. 3. The incidents prompted claims of racial discrimination and bias in the technology used by Uber, as the failures disproportionately affected drivers of color [119712, 129924, 119714]. 4. The incidents highlighted the potential discriminatory impact of facial recognition software on ethnic minority groups, raising concerns about the fairness and accuracy of such technologies [119712, 129924, 119714]. 5. Drivers who were wrongly dismissed faced difficulty seeking compensation and reinstatement, causing further distress and uncertainty [119714]. 6. The incidents led to protests and strikes by affected drivers and their supporters, drawing attention to facial recognition technology and its implications for workers' rights and equality [119714]. |
Preventions | 1. Testing the facial recognition software against a more diverse population to ensure it works reliably for people of all skin tones [119712, 129924, 119714]. 2. Incorporating meaningful human review in the facial recognition process to detect and resolve misidentifications and other failures (a minimal decision-gate sketch follows this table) [119714]. 3. Conducting a thorough investigation and giving clear reasons before any deactivation decision based on the software's results [119714]. 4. Engaging in open dialogue with affected individuals to address concerns about the software's performance [129924, 119714]. 5. Offering compensation or support to individuals wrongly affected by the software failure [119714]. |
Fixes | 1. Implementing meaningful human review in the facial recognition process to detect and resolve misidentifications and other failures [119714]. 2. Improving the facial recognition software to reduce error rates when recognizing people with darker skin [119712, 119714]. 3. Conducting thorough investigations and providing clear reasons for any deactivation decision made by the software [129924]. 4. Allowing drivers to appeal decisions made by the software, with an additional human review step for verification [119712, 119714]. 5. Scrapping the current facial recognition algorithm and considering alternative methods of verifying driver identities to prevent discriminatory impacts [119712, 119714]. |
References | 1. Uber 2. Independent Workers’ Union of Great Britain (IWGB) 3. Microsoft 4. Transport for London (TfL) 5. Equal Opportunities Commission 6. App Drivers & Couriers Union 7. Black Lives Matter organization |
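Preventions 2 and Fixes 1 and 4 call for a human decision before any deactivation. Below is a minimal sketch of such a decision gate, assuming a hypothetical `match_confidence` score from the verification step and a hypothetical review queue; the names and threshold are illustrative, since the articles do not describe Uber's system at this level of detail.

```python
from dataclasses import dataclass

# Hypothetical decision gate: automated verification alone never
# deactivates an account. Ambiguous or failed matches are escalated
# to a human reviewer instead. All names and thresholds are illustrative.

AUTO_ACCEPT = 0.90  # at or above this score, treat the selfie as a match

@dataclass
class CheckInResult:
    driver_id: str
    match_confidence: float  # similarity between selfie and profile photo

def gate_check_in(result: CheckInResult, review_queue: list) -> str:
    """Route a check-in by verification confidence; never auto-deactivate."""
    if result.match_confidence >= AUTO_ACCEPT:
        return "accepted"
    # A low score is a signal for review, not proof of account sharing:
    # false non-matches are reported to be more frequent for darker skin.
    review_queue.append(result)
    return "pending_human_review"

queue: list[CheckInResult] = []
print(gate_check_in(CheckInResult("driver-123", 0.42), queue))
# -> pending_human_review
```

The key design choice is that the automated result can only accept or escalate, never reject: rejection requires a reviewer who can see both photos, which is precisely the step the affected drivers asked for [119714].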
Category | Option | Rationale |
---|---|---|
Recurring | one_organization | (a) The software failure incident having happened again at one_organization: - The incident of facial recognition software wrongly identifying individuals of color has occurred multiple times within Uber. Several drivers, including Pa Edrissa Manjang and Imran Javaid Raja, have had their accounts deactivated due to mismatches between their selfie pictures and the ones on their Uber profiles [119712, 129924, 119714]. (b) The software failure incident having happened again at multiple_organization: - The articles do not mention the same incident happening at other organizations or with their products and services. |
Phase (Design/Operation) | design, operation | (a) The software failure incident related to the design phase: - The false dismissals of Uber drivers were attributed to an automated ID-check system that Uber introduced in April 2020, under which each driver checking in for work must take a selfie that is compared to the photo on their Uber account profile [119714] (a sketch of this selfie-to-profile comparison follows this table). - The facial-verification software used by Uber to log drivers onto the ride-hailing app failed to recognize a black driver, leading to his account being deactivated [119712]. - Microsoft, the maker of the facial recognition software used by Uber, acknowledged in 2019 that facial recognition software did not work as well for people of color and could fail to recognize them [119712]. (b) The software failure incident related to the operation phase: - Uber drivers faced issues with the facial recognition software during their daily operations, where they had to take selfies to verify their identity for work [129924]. - Drivers like Pa Manjang were suspended or permanently deactivated from the Uber platform after the facial recognition software wrongly decided their selfies were of someone else, impacting their ability to operate and earn a living [129924]. - The affected drivers had to deal with the consequences of being locked out of the system, facing financial difficulties and challenges in getting their accounts reinstated [119712, 129924]. |
Boundary (Internal/External) | within_system, outside_system | (a) within_system: - The failure was primarily within the system: the automated face-scanning software wrongly identified drivers' selfies as someone else, leading to their accounts being deactivated [119712, 129924, 119714]. - Uber's automated ID check compares a driver's check-in selfie to their profile picture, and the software's mistaken comparisons directly triggered the deactivations [119714]. - Drivers such as Pa Manjang and Imran Javaid Raja were deactivated for continued mismatches between their selfie pictures and the ones on their Uber profiles, indicating an issue within the system [119714]. - These failures resulted in false dismissals, financial hardship, and emotional distress [119712, 129924, 119714]. (b) outside_system: - Contributing factors also originated outside Uber's own system: the underlying facial recognition technology, supplied by Microsoft, performs worse for people of color, so false positives and false negatives fall disproportionately on ethnic minority groups [119712, 129924, 119714]. - The racial discrimination claims against Uber rest on this externally sourced technology's inability to verify individuals of color reliably [119712, 129924, 119714]. |
Nature (Human/Non-human) | non-human_actions, human_actions | (a) The software failure incident occurring due to non-human actions: - The articles describe incidents where automated face-scanning software used by Uber failed to recognize drivers of color, leading to their accounts being deactivated [119712, 129924, 119714]. - The facial recognition software, specifically the Face API from Microsoft, was mentioned to be the technology behind the automated system used by Uber to verify driver identities [129924]. - The software was reported to have wrongly decided that the drivers' selfies were of someone else, leading to their suspension or deactivation from the platform [119712, 129924, 119714]. - Studies have shown that facial recognition software can have higher error rates when recognizing people with darker skin tones compared to lighter-skinned individuals [119712, 119714]. - The software failure incidents were attributed to the discriminatory impact of the facial recognition process on individuals from ethnic minority groups [129924]. (b) The software failure incident occurring due to human actions: - Human actions were involved in the decision-making process following the software's failure. For example, in one case, a driver suggested that a human review the photos, but the account was still deactivated after "careful consideration" by Uber [119714]. - The drivers affected by the software failure incidents raised concerns about the lack of investigation into the potential discriminatory effect of the automated software and the failure to allow a human to review the photos [119714]. - Uber was criticized for not adequately addressing the issues raised by the drivers and for not providing clear explanations or opportunities for human intervention in the decision-making process [119714]. - The drivers affected by the software failures took legal action against Uber, alleging race discrimination and wrongful deactivation of their accounts [119712, 129924, 119714]. |
Dimension (Hardware/Software) | software | (a) The articles do not report any hardware-related contributing factors. (b) The reported failures are rooted in the facial recognition software used by Uber and Uber Eats: the automated face-scanning software failed to recognize drivers of color, leading to accounts being deactivated or suspended. The drivers allege that the software wrongly decided their selfies were of someone else and that it is racially biased, with false positives and false negatives reported to be greater for individuals from ethnic minority groups (an illustrative per-group error-rate calculation appears after the final table below). These incidents led to legal claims of race discrimination and harassment against Uber and Uber Eats, and the contributing factors originate in the software itself, specifically the facial recognition algorithms used by the companies [119712, 129924, 119714]. |
Objective (Malicious/Non-malicious) | non-malicious | (a) The failure falls under the non-malicious category: the automated face-scanning software failed to recognize drivers of color, leading to their accounts being deactivated or suspended, and nothing in the articles suggests deliberate intent to harm the system or its users [119712, 129924, 119714]. The affected drivers raised concerns about racial bias in the facial recognition technology, with false positives and false negatives claimed to be greater for individuals from ethnic minority groups, and alleged that the software wrongly decided their selfies were of someone else [119712, 129924, 119714]. Uber and Uber Eats defended the technology as a measure to protect the safety and security of users, stating that human expert reviews take place before any decision to remove a driver; the drivers, however, criticized the lack of transparency in the deactivation process and the companies' failure to address the potential discriminatory impact of the automated software. The harm stemmed from flawed technology and flawed process rather than any malicious objective [119712, 129924, 119714]. |
Intent (Poor/Accidental Decisions) | poor_decisions | (a) The intent of the software failure incident: - The software failure incident related to the facial recognition software used by Uber for driver verification was due to poor decisions made in implementing the automated system. The system wrongly decided drivers' selfies were of someone else on multiple occasions, leading to account deactivations and dismissals [119712, 129924, 119714]. - Uber introduced an automated system to check the ID of drivers, requiring them to take a selfie picture that is then compared to one on their Uber account profile. This system led to drivers being falsely dismissed because of malfunctioning face recognition technology, indicating poor decisions in the implementation of the software [119714]. |
Capability (Incompetence/Accidental) | development_incompetence, accidental | (a) development_incompetence: - The false dismissals were attributed to the automated facial-verification software wrongly deciding on several occasions that the drivers' selfie pictures were of someone else [119714]. - Under Uber's automated ID check, each driver checking in for work must take a selfie that is compared to the photo on their Uber account profile; this system wrongly deactivated drivers' accounts over continued mismatches between the check-in pictures and the profile picture [119714]. (b) accidental: - The software's mistakes in deciding that drivers' selfies were of someone else, which led to account deactivations, were not deliberate [119714]. - The affected drivers claimed the software was racially biased and placed ethnic minority groups at a disadvantage, indicating the failures resulted from the software's inherent biases rather than intent [129924]. |
Duration | permanent, temporary | The software failure incident related to the facial recognition technology used by Uber for driver verification can be categorized as both permanent and temporary. Permanent: - The incident led to permanent deactivation of accounts for drivers who failed the facial recognition checks, with decisions being made on a permanent basis without further review [119712]. - One driver received a message stating that the decision to end the partnership had been made on a permanent basis [119712]. - Another driver was permanently suspended for 'sharing' his account after failing the Face API multiple times [129924]. - A driver was dismissed for "continued mismatches" between the pictures he took to register for a shift and the one on his Uber work profile [119714]. Temporary: - Drivers who failed the facial recognition checks were initially locked out of the system for 24 hours before further actions were taken [119712]. - One driver was reinstated six months later after legal action was taken [129924]. - Another driver was reappointed the following month after being dismissed, with Uber accepting they had made a mistake [119714]. |
Behaviour | value, other | (a) crash: The articles do not describe the system losing state or ceasing to perform its intended functions. (b) omission: The incidents do not involve the system omitting its intended functions at particular instances. (c) timing: The incidents do not involve the system performing correctly but too late or too early. (d) value: The failure centres on the system performing its intended function, facial recognition, incorrectly, leading to the wrongful deactivation of drivers' accounts [119712, 129924, 119714]. (e) byzantine: The incidents do not exhibit inconsistent responses or interactions. (f) other: The other behaviour observed is the discriminatory impact of the facial recognition software on individuals from ethnic minority groups, particularly people of color: its failure to accurately recognize faces with darker skin tones led to wrongful deactivations and dismissals, highlighting a significant flaw in the system's design and implementation [119712, 129924, 119714]. |
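The check-in flow described in the Phase row above (selfie compared to profile photo) maps onto a face-verification call. Below is a minimal sketch assuming the documented request/response shape of Microsoft's Face API 'verify' operation, which takes two previously detected face IDs and returns an isIdentical flag and a confidence score; the endpoint, key, and face IDs are placeholders, and the articles do not describe Uber's actual integration.

```python
import requests

# Placeholder endpoint and key -- not Uber's actual integration.
ENDPOINT = "https://example.cognitiveservices.azure.com"
API_KEY = "<subscription-key>"

def verify_selfie(selfie_face_id: str, profile_face_id: str) -> dict:
    """Compare two previously detected faces with the Face 'verify'
    operation. Face IDs are assumed to come from a prior 'detect' call."""
    response = requests.post(
        f"{ENDPOINT}/face/v1.0/verify",
        headers={"Ocp-Apim-Subscription-Key": API_KEY},
        json={"faceId1": selfie_face_id, "faceId2": profile_face_id},
        timeout=10,
    )
    response.raise_for_status()
    # Documented response shape: {"isIdentical": bool, "confidence": float}
    return response.json()
```

The failure mode the articles describe corresponds to treating a negative result here as conclusive: a non-match can equally be a false negative, which studies report occurs more often for darker-skinned faces [119712, 119714].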
Layer | Option | Rationale |
---|---|---|
Perception | None | None |
Communication | None | None |
Application | None | None |
Category | Option | Rationale |
---|---|---|
Consequence | basic, property, delay, other | (a) death: There is no mention of any deaths resulting from the software failure [119712, 129924, 119714]. (b) harm: There is no mention of physical harm to individuals [119712, 129924, 119714]. (c) basic: The failure impacted the livelihood and income of the affected drivers, potentially leading to financial hardship [119712, 129924, 119714]. (d) property: Drivers lost their jobs or were locked out of the system, impacting their ability to earn income [119712, 129924, 119714]. (e) delay: Drivers were delayed in working and accessing the Uber platform while locked out or deactivated [119712, 129924, 119714]. (f) non-human: The failure primarily affected human drivers; there is no mention of non-human entities being impacted [119712, 129924, 119714]. (g) no_consequence: Not applicable; the failure had real consequences, including job loss, financial difficulty, and delays in accessing the platform [119712, 129924, 119714]. (h) theoretical_consequence: The articles do not discuss potential consequences that did not actually occur [119712, 129924, 119714]. (i) other: Beyond the consequences above, affected drivers reported emotional distress, and the incidents prompted protests, strikes, and legal claims of race discrimination [119712, 129924, 119714]. |
Domain | transportation, health, other | (a) The failed system supported the transportation industry: Uber's ride-hailing service used the facial recognition software to verify the identity of drivers logging onto the platform [119712, 129924, 119714]. (j) The failed system was also related indirectly to health: drivers working for Uber Eats, Uber's food delivery service, experienced stress and financial hardship after being locked out of the system by the facial recognition software [129924, 119714]. (m) The system could also be categorized as "other": a facial recognition identity-verification system for the gig economy, which does not fall squarely into the traditional industry categories [119712, 129924, 119714]. |
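The Dimension row above cites higher false positive and false negative rates for ethnic minority groups. As a worked illustration of how such a disparity is measured, the sketch below computes a per-group false rejection rate (genuine drivers whose own selfies fail verification, divided by genuine attempts) from entirely made-up counts; the articles report no figures.

```python
# Entirely hypothetical counts, for illustration only -- the articles
# report no figures. A "false rejection" is a genuine driver whose own
# selfie fails verification.
genuine_attempts = {"group_a": 10_000, "group_b": 10_000}
false_rejections = {"group_a": 80, "group_b": 310}

for group, attempts in genuine_attempts.items():
    frr = false_rejections[group] / attempts
    print(f"{group}: false rejection rate = {frr:.2%}")
# group_a: false rejection rate = 0.80%
# group_b: false rejection rate = 3.10%
```

Even a modest per-attempt disparity compounds under a "continued mismatches" deactivation rule, since a driver checking in daily accumulates many chances to be falsely rejected.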
Article ID: 119712
Article ID: 129924
Article ID: 119714