Published Date: 2018-03-19
Postmortem Analysis | |
---|---|
Timeline | 1. The fatal crash involving an Uber self-driving car in Tempe, Arizona, likely caused by a software problem, occurred in March 2018 [71277]. 2. The Uber self-driving test vehicle struck and killed a woman in March 2018, with software problems identified as a factor [91874]. 3. The National Transportation Safety Board (NTSB) later released documents on the 2018 crash that revealed glaring safety holes in the software [91718]. 4. The NTSB report on the 2018 crash highlighted software failures and an inadequate safety culture at Uber [104825]. 5. The NTSB investigation into the 2018 crash also cited the operator's failure to monitor the surroundings and the automated system [105692]. |
System | 1. Uber's software system [68919, 69073, 69754, 71142, 71277, 71698, 91718, 91719, 91891, 91977, 104825, 105189, 105692] 2. Lidar laser sensor system [68919, 69754] 3. Volvo XC90's automatic emergency braking system [91977, 105189, 105692] |
Responsible Organization | 1. Uber's software system [71277] 2. Uber's inadequate safety culture [91718, 91719, 91977, 104825, 105189] 3. The safety driver's distraction and failure to monitor the driving environment [105692] |
Impacted Organization | 1. Uber [69011, 69073, 69097, 71277, 71346, 91874, 91891, 91977, 104825] 2. National Transportation Safety Board (NTSB) [69073, 69754, 71142, 91718, 91977, 104825] |
Software Causes | 1. The software was tuned to ignore "false positives" such as plastic bags or pieces of paper, which delayed its reaction to the pedestrian crossing the street [71277]. 2. Uber's software was not equipped to identify or deal with pedestrians walking outside of a crosswalk, further delaying the reaction [91719]. 3. The software failed to identify the victim as a pedestrian and did not address "operators' automation complacency" [91977]. 4. Uber's system could not correctly classify the pedestrian crossing midblock or predict her path, contributing to the incident [105189]. (A sketch of this detection-to-reaction tuning follows this table.) |
Non-software Causes | 1. Lack of appropriate oversight for vehicle operators [91778] 2. Failure of the operator to monitor their surroundings due to distraction [104825, 105189, 105692] |
Impacts | 1. The software problem likely caused a fatal accident involving one of Uber's self-driving cars in Tempe, Arizona, killing a pedestrian [71277]. 2. Uber's software did not properly identify the pedestrian and failed to predict her movement into the vehicle's path, resulting in the fatal crash [91891]. 3. The NTSB investigation concluded that the crash was caused by the operator's failure to monitor the surroundings while visually distracted by a personal cell phone, highlighting the impact of human-machine interaction in autonomous systems [104825]. 4. Uber's inadequate safety culture, including the lack of an operational safety division or safety manager, contributed to the crash; the software was not equipped to identify or deal with pedestrians outside of crosswalks and contained safety holes [91718, 91719]. 5. The NTSB found that Uber's system could not correctly classify the pedestrian crossing midblock or predict her path, underscoring the software's limitations in real-world scenarios [105189]. |
Preventions | 1. Ensuring the safety driver remains attentive and ready to intervene at all times during autonomous vehicle testing [68919, 69073, 71698, 91778, 104825, 105692]. 2. Implementing a system to monitor the driver's attentiveness and alert them if they are not focused on the road (a sketch of such a watchdog follows this table) [71698]. 3. Properly identifying pedestrians and potential collision risks in the software so it can react in time [71277, 91891, 91977]. 4. Conducting thorough safety risk assessments and addressing safety concerns raised by employees [104825]. 5. Avoiding automation complacency and ensuring operators are adequately trained and prepared to take control when needed [91977, 105692]. |
Fixes | 1. Improving safety risk assessment procedures and oversight of vehicle operators [104825]. 2. Addressing the software's inability to properly identify pedestrians and assess safety risks [91977, 105692]. 3. Implementing critical program improvements that prioritize safety, including handling scenarios such as jaywalking and ensuring human drivers can intervene when necessary [91891]. 4. Enhancing the software's ability to react promptly to detected objects and avoiding tuning that ignores potential hazards [71277]. 5. Continuously updating and improving the software, with incidents logged and reviewed by engineers [69754]. 6. Reevaluating training processes for vehicle operators so they are equipped to intervene effectively [69073]. 7. Using two-person testing teams to improve monitoring and note-taking during drives [71698]. |
References | 1. National Transportation Safety Board (NTSB) [68919, 71142, 91891, 91977, 104825] 2. Uber [68919, 69754, 71277, 91874, 104825] 3. The Information [71277] 4. Experts in autonomous vehicle law [105189] 5. Engineers, professors, former Uber employees, and lawyers involved in or commenting on the incident [68919, 69073, 69754, 71277, 91874] 6. Investigators [91718, 91719] 7. Tempe police [69754, 71277] 8. US Department of Transportation's National Highway Traffic Safety Administration [71277] 9. Boston lawyer Matt Henshon [69073] 10. California Institute of Technology engineering professor Richard Murray [69754] 11. Salvador Rodriguez in San Francisco and Eric Johnson in Seattle [69754] 12. David Shepardson [105692] |
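The Software Causes row describes two mechanisms working together: a classifier tuned so hard against "false positives" that real pedestrians could be dismissed, and a built-in delay (reported as roughly one second [91719]) between detecting a hazard and acting on it. The Python sketch below is a minimal, hypothetical reconstruction of that decision logic, not Uber's actual code; the labels, confidence floor, and suppression window are all illustrative assumptions.

```python
from dataclasses import dataclass

# Hypothetical object categories; the real taxonomy is not public.
IGNORABLE_LABELS = {"plastic_bag", "paper"}

@dataclass
class Detection:
    label: str               # classifier output for the tracked object
    confidence: float        # classifier confidence in [0, 1]
    seconds_to_impact: float # time remaining before the predicted impact

def plan_braking(d: Detection,
                 confidence_floor: float = 0.9,
                 action_suppression_s: float = 1.0) -> str:
    """Decide whether to brake for a detected object.

    Models the two reported failure modes:
    1. tuning that suppresses "false positives" (bags, paper) also
       suppresses reactions to real but low-confidence pedestrians;
    2. a fixed action-suppression window delays braking even once the
       object is judged real.
    """
    if d.label in IGNORABLE_LABELS or d.confidence < confidence_floor:
        return "ignore"      # failure mode 1: tuned-out detection
    if d.seconds_to_impact <= action_suppression_s:
        return "too_late"    # failure mode 2: impact falls inside the delay window
    return "brake"

# A real pedestrian, classified with middling confidence 1.2 s out,
# is silently dropped by the tuned filter:
print(plan_braking(Detection("pedestrian", 0.6, 1.2)))  # -> ignore
```

Prevention 2 proposes a system that monitors the driver's attentiveness and alerts a distracted operator [71698]. The articles do not describe a concrete implementation, so the watchdog below is a sketch under the assumption that an in-cab camera supplies a periodic eyes-on-road signal; the thresholds and escalation states are invented for illustration.

```python
from typing import Optional

class AttentionMonitor:
    """Hypothetical eyes-off-road watchdog; the articles propose the
    idea but do not specify an implementation [71698]."""

    def __init__(self, max_eyes_off_s: float = 2.0) -> None:
        self.max_eyes_off_s = max_eyes_off_s
        self._eyes_off_since: Optional[float] = None

    def update(self, eyes_on_road: bool, now_s: float) -> str:
        """Return an escalating status given the current gaze estimate."""
        if eyes_on_road:
            self._eyes_off_since = None
            return "ok"
        if self._eyes_off_since is None:
            self._eyes_off_since = now_s
        elapsed = now_s - self._eyes_off_since
        if elapsed > 2 * self.max_eyes_off_s:
            return "takeover_request"  # escalate: demand the driver intervene
        if elapsed > self.max_eyes_off_s:
            return "audible_alert"
        return "ok"

monitor = AttentionMonitor()
for t, eyes_on in enumerate([True, False, False, False, False, False]):
    print(t, monitor.update(eyes_on, float(t)))
# Prints "audible_alert" from t=4 on: sustained distraction triggers a warning.
```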
Category | Option | Rationale |
---|---|---|
Recurring | one_organization | (a) The software failure incident happening again at one_organization: - The March 2018 incident involving the Uber self-driving car in Tempe, Arizona was caused by a software problem that prevented the car from reacting fast enough to a pedestrian crossing the street [71277]. - The car's sensors detected the pedestrian, but the software determined that it did not need to react immediately, leading to the fatal accident [71277]. - The software was tuned to ignore certain objects considered "false positives," which delayed its reaction to the pedestrian [71277]. - The software's inability to properly identify the pedestrian and predict her movements was a major flaw in the system [91891]. (b) The software failure incident happening again at multiple_organization: - The articles do not mention a similar software failure incident at other organizations or in their products and services. |
Phase (Design/Operation) | design, operation | (a) Design phase: - A software problem likely caused the fatal accident involving one of Uber's self-driving cars in Tempe, Arizona in March 2018. The software was meant to determine how the car should react to detected objects, but, because of how it was tuned, it determined that it did not need to react immediately [71277]. - Uber's system was not equipped to identify or deal with pedestrians walking outside of a crosswalk, and engineers had built in an automated one-second delay between crash detection and action out of concern about false alarms (modeled in the detection-to-reaction sketch above) [91719]. (b) Operation phase: - The test driver behind the wheel was supposed to intervene if the autonomous driving software failed, but she was visually distracted throughout the trip by her personal cell phone, spending 34% of the time looking at it while streaming a TV show; this distraction led to a failure to monitor the driving environment, a critical operational responsibility [104825]. - The safety driver was not watching the road in the moments leading up to the collision, a failure in the operation phase that left her ill-equipped to prevent the crash [71698]. |
Boundary (Internal/External) | within_system, outside_system | (a) within_system: - The software was not equipped to identify or deal with pedestrians outside of crosswalks, had safety holes, and could not accurately predict pedestrian movements [91718, 91719]. - The preliminary report noted that the self-driving system software classified the pedestrian as an unknown object, then as a vehicle, and then as a bicycle, indicating internal classification instability (a sketch of this instability follows this table) [71346]. - The software failed to properly identify the pedestrian and did not address operators' automation complacency [91977]. - The software was tuned to react less to certain objects, which delayed its reaction when the pedestrian crossed the street [71277]. (b) outside_system: - The safety driver was visually distracted by her personal cell phone, an external factor contributing to the incident [91977, 104825, 105692]. - The NTSB identified the operator's failure to monitor the surroundings as a probable cause of the accident [104825]. - The NTSB report highlighted that the safety driver was visually distracted throughout the trip by her personal cell phone [105692]. |
Nature (Human/Non-human) | non-human_actions, human_actions | (a) Non-human actions: - A software problem likely caused the fatal accident: the software that determined how the car should react to detected objects reportedly did not react fast enough because of how it was tuned [71277]. - Uber's system was not equipped to identify or deal with pedestrians walking outside of a crosswalk, and the software had glaring safety holes, including an automated one-second delay between crash detection and action and a late recognition of the need to brake before impact [91718, 91719]. - The software did not properly identify the pedestrian and failed to predict her movement into the vehicle's path, per the NTSB report [91891]. (b) Human actions: - The NTSB primarily blamed the distracted safety operator for the fatal crash and identified an "inadequate safety culture" at Uber as a major contributing factor [91873]. - The NTSB found that the back-up safety driver's failure to monitor the driving environment, because she was visually distracted by her personal cell phone, was a probable cause of the crash; Uber also made development decisions that contributed, such as deactivating the Volvo XC90's automatic emergency braking system and relying on the back-up driver instead [91977, 104825, 105692]. |
Dimension (Hardware/Software) | hardware, software | (a) Hardware: - The car's hardware sensors detected the pedestrian, so the sensing hardware itself functioned; the failure arose in the interaction between the software and those sensors, since the software determined from the sensor data that it did not need to react immediately [71277]. (b) Software: - Uber's software was tuned in a way that made it react less to certain objects, delaying its reaction when the pedestrian crossed the street [71277]. - Although the sensors detected the pedestrian, the software responsible for deciding how the car should react was tuned too far in favor of ignoring objects in its path, again delaying the reaction [71962]. |
Objective (Malicious/Non-malicious) | non-malicious | (a) The software failure incident in the Uber self-driving car crash in Tempe, Arizona was non-malicious. It was attributed to software problems that kept the car from reacting appropriately to a pedestrian crossing the street: the software was tuned to dismiss certain objects as "false positives," delaying its reaction once the pedestrian was detected [71277]; it was not equipped to identify pedestrians outside of crosswalks and contained safety holes that did not sufficiently account for human error [91718, 91719]; and it failed to properly identify the pedestrian and predict her movements [91891]. (b) Human factors also contributed: the safety driver, who was supposed to intervene in an emergency, was visually distracted by her personal cell phone during the trip and failed to do so [91977, 104825, 105692]. The NTSB also cited Uber's inadequate safety risk assessment procedures and ineffective oversight of vehicle operators as contributing factors [104825]. |
Intent (Poor/Accidental Decisions) | poor_decisions, accidental_decisions | (a) Poor decisions: - The NTSB criticized a series of Uber decisions resulting from an "ineffective safety culture" at the time as contributing to the crash's cause [91977]. - Uber's system was not equipped to identify or deal with pedestrians walking outside of a crosswalk, and there were glaring safety holes in the software [91718, 91719]. - Uber made development decisions that contributed to the crash, such as deactivating the Volvo XC90's automatic emergency braking system in the test vehicle and relying on the back-up driver instead of immediate emergency braking [91977]. (b) Accidental decisions: - The incident was primarily attributed to a distracted safety operator and an "inadequate safety culture" at Uber [91873, 105189]. - The NTSB investigation concluded that the crash occurred because the safety driver was distracted by her phone and that Uber's inadequate safety culture contributed [105189]. - The software reportedly did not react fast enough to the pedestrian crossing the street because of how it was tuned to ignore certain objects [71277, 71962]. |
Capability (Incompetence/Accidental) | development_incompetence, accidental | (a) Development incompetence: the crash was primarily caused by a software problem in Uber's self-driving car, whose software failed to react to a detected pedestrian [71277]. The NTSB also found that Uber's software contained glaring safety holes and was not equipped to identify or deal with pedestrians outside of crosswalks, indicating a lack of professional competence in software development [91718, 91719]. (b) Accidental: the software was tuned to ignore objects in its path that might be "false positives," such as plastic bags, which delayed the reaction needed to avoid the pedestrian [71962]. The NTSB also found that the distracted safety operator was primarily to blame for the crash, an accidental factor contributing to the incident [91873]. |
Duration | permanent, temporary | (a) Permanent: the failure stemmed from contributing factors present throughout the system's operation, including the software's inability to properly identify the pedestrian, inadequate safety risk assessment procedures, ineffective oversight of vehicle operators, and the deactivation of the emergency braking system [71277, 91977, 104825]. (b) Temporary: the failure could also be considered temporary to some extent, since the software was constantly being updated and improved and incidents in the cars were logged and checked by engineers, suggesting the issues could be fixed and similar incidents prevented [69754]. |
Behaviour | crash, omission, timing, value, byzantine, other | (a) crash: a self-driving Uber vehicle struck and killed a pedestrian; the software failed to prevent the collision, leading to the fatal outcome [71142]. (b) omission: the software omitted to identify the victim as a pedestrian and did not adequately assess safety risks, contributing to the crash [91977]. (c) timing: the software reacted too late to the pedestrian crossing the street, reportedly failing to respond fast enough [71277]. (d) value: the software performed its intended functions incorrectly, neither properly identifying the pedestrian nor predicting her movement into the vehicle's path [91891]. (e) byzantine: the software behaved erroneously and inconsistently, for example being unequipped to identify or deal with pedestrians walking outside of a crosswalk and imposing an automated delay between crash detection and action [91719]. (f) other: the incident also raised questions about the software's design and rollout, potential negligence by developers, missed nonworking sensors, and where ultimate responsibility lies between the human driver and the car itself [69073]. |
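The Boundary and Behaviour rows note that the system classified the pedestrian as an unknown object, then a vehicle, then a bicycle [71346], and that it could not predict her path [91891, 105189]. One plausible mechanism, offered here as an assumption consistent with those descriptions rather than a confirmed detail, is that each reclassification discards the accumulated motion history a tracker needs for path prediction:

```python
from typing import List, Optional, Tuple

class TrackedObject:
    """Minimal tracker keeping a per-object position history.

    Illustrative only: models a design in which a reclassification
    (unknown object -> vehicle -> bicycle) resets the track, so the
    motion history needed for path prediction never accumulates.
    """

    def __init__(self) -> None:
        self.label: Optional[str] = None
        self.history: List[Tuple[float, float]] = []

    def update(self, label: str,
               position: Tuple[float, float]) -> Optional[Tuple[float, float]]:
        if label != self.label:
            self.label = label
            self.history = []  # reclassification discards the history
        self.history.append(position)
        if len(self.history) < 2:
            return None        # one point cannot yield a velocity
        (x0, y0), (x1, y1) = self.history[-2], self.history[-1]
        return (x1 - x0, y1 - y0)  # crude per-frame velocity estimate

track = TrackedObject()
for label, pos in [("unknown_object", (0.0, 3.5)),
                   ("vehicle",        (0.4, 3.3)),
                   ("bicycle",        (0.8, 3.1))]:
    print(label, track.update(label, pos))
# Every frame prints None: the flickering labels keep erasing the
# history, so no crossing path is ever predicted.
```

With the label changing every frame, the history never reaches even the two points needed for a crude velocity estimate, so the planner never learns that the object is moving into the vehicle's path.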
Layer | Option | Rationale |
---|---|---|
Perception | sensor, processing_unit, embedded_software | (a) sensor: the failure involved the sensor system, particularly the lidar laser sensor, which failed to prevent the car from hitting a pedestrian owing to factors such as sensor positioning, sensor data processing, and the software's response [68919, 69754, 71962]. (b) actuator: the articles do not tie the failure to an actuator. (c) processing_unit: the failure involved the processing unit, as the software did not properly identify the pedestrian, did not adequately assess safety risks, and did not address automation complacency [91977, 105692]. (d) network_communication: the articles do not tie the failure to network communication. (e) embedded_software: the failure involved the embedded software, which was not equipped to identify or deal with pedestrians outside of crosswalks, imposed a delay between crash detection and action, and failed to predict the pedestrian's movements [91719, 91891]. (A minimal pipeline sketch mapping these layers follows this table.) |
Communication | link_level, connectivity_level | The NTSB investigation concluded that the crash was caused by the operator's failure to monitor the driving environment because she was distracted by her phone; this points to a failure at the connectivity_level, since the breakdown was in monitoring the system's wider operating environment. The NTSB also found that Uber's system could not correctly classify the pedestrian crossing midblock or predict her path, a link_level failure in the software's handling of its immediate inputs [105189]. |
Application | TRUE | The failure was related to the application layer of the cyber-physical system. It was attributed to software issues such as the system's inability to identify or deal with pedestrians outside of a crosswalk, safety holes in the software, and the failure to predict the pedestrian's movement into the vehicle's path [91718, 91719, 91891]. These issues point to failures introduced by bugs, errors, and incorrect usage at the application layer. |
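To make the Layer options above concrete, the sketch below maps them onto a minimal three-stage pipeline. It is purely illustrative: the function names, data shapes, and values are assumptions, and the comments tie each stage to the failure the articles attribute to it.

```python
# Illustrative mapping of the Layer options onto a minimal perception
# pipeline; all names and values here are hypothetical.

def sensor_layer(raw_lidar_range_m: float) -> dict:
    # sensor: the lidar did return a detection of the pedestrian [68919].
    return {"range_m": raw_lidar_range_m, "detected": True}

def processing_layer(reading: dict) -> dict:
    # processing_unit: classification ran here and never settled on
    # "pedestrian" (unknown object / vehicle / bicycle) [71346, 91977].
    reading["label"] = "unknown_object"
    return reading

def embedded_software_layer(obj: dict) -> str:
    # embedded_software: planning logic had no handling for pedestrians
    # outside a crosswalk and delayed the braking decision [91719, 91891].
    return "brake" if obj["label"] == "pedestrian" else "no_brake"

print(embedded_software_layer(processing_layer(sensor_layer(50.0))))  # -> no_brake
```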
Category | Option | Rationale |
---|---|---|
Consequence | death, harm, theoretical_consequence | (a) death: - The March 2018 accident involving one of Uber's self-driving cars in Tempe, Arizona killed a pedestrian [71277]. - The collision marked the first fatality attributed to a self-driving car [71962]. - The NTSB released documents related to the 2018 crash in which a self-driving Uber killed a pedestrian [91718]. - The NTSB voted that the probable cause of the crash was the back-up safety driver's failure to monitor the driving environment, resulting in the pedestrian's death [91977]. (b) harm: - The fatal crash raised questions about the safety of testing self-driving cars on public roads and whether companies take enough precautions to protect other drivers and pedestrians [71142]. - The collision harmed the pedestrian struck by the self-driving car [71962]. - The NTSB criticized Uber for an ineffective safety culture and for decisions that contributed to the crash's cause [91977]. (h) theoretical_consequence: - There were discussions about fatal crashes involving autonomous vehicles becoming more commonplace as testing is introduced and expanded [69073]. - The articles describe reflection within the industry and calls for reform following the fatal collision, pointing to potential changes in regulations and practices [69011]. |
Domain | transportation | (a) The failed system belonged to the transportation industry, specifically autonomous vehicles and self-driving technology: a self-driving Uber vehicle was involved in a fatal collision with a pedestrian [68919, 69011, 69073, 71142, 71277, 71698, 91718, 91719, 91891, 91977, 104825]. |
Article ID: 91778
Article ID: 71698
Article ID: 71873
Article ID: 91718
Article ID: 91873
Article ID: 71346
Article ID: 91977
Article ID: 69011
Article ID: 68919
Article ID: 69097
Article ID: 71277
Article ID: 69073
Article ID: 91891
Article ID: 104825
Article ID: 92778
Article ID: 69754
Article ID: 71142
Article ID: 71962
Article ID: 105189
Article ID: 105692
Article ID: 91874
Article ID: 91719