Published Date: 2022-09-01
Postmortem Analysis | |
---|---|
Timeline | 1. The crash involving Cruise's self-driving vehicle occurred in June 2022, and the corrective software update was deployed in July 2022 [130924, 131878, 131474]. |
System | 1. Cruise's self-driving software [130924, 131878, 131474] |
Responsible Organization | 1. Cruise (General Motors subsidiary), whose self-driving software incorrectly predicted the oncoming vehicle's path [130924, 131878, 131474] 2. The driver of the oncoming vehicle, who was speeding in a turn lane and was found most at fault by the police report [130924, 131474] |
Impacted Organization | 1. Cruise LLC [130924, 131878, 131474] 2. General Motors [130924, 131878, 131474] 3. Occupants of the vehicles involved in the crash, two of whom were injured [130924, 131878, 131474] |
Software Causes | 1. The software in the autonomous driving vehicles incorrectly predicted the path of an oncoming vehicle, leading to a crash [130924, 131878, 131474]. 2. In certain circumstances, such as unprotected left turns, the autonomous driving system could incorrectly predict another vehicle's path or be insufficiently reactive to a road user's sudden path change [131474]. |
Non-software Causes | 1. The crash occurred when a Cruise vehicle attempting an unprotected left turn across a two-lane street was struck by an oncoming car that was speeding in a turn lane [130924]. 2. The other vehicle continued straight through the intersection, T-boning the now-stationary Cruise car [130924]. 3. The oncoming vehicle drove in the right-turn lane at approximately 40 mph in a 25-mph zone before exiting the lane and proceeding forward [131878]. 4. The police report found the other vehicle, which was traveling at 40 mph in a 25-mph zone, most at fault for the June crash [131474]. |
Impacts | 1. The software failure incident involving Cruise's self-driving vehicles led to a recall and software update on 80 vehicles after a crash in San Francisco that left two people injured [130924, 131878, 131474]. 2. Following the incident, Cruise temporarily prevented its vehicles from making unprotected left turns and reduced the area in which its vehicles could operate until a software update was implemented [130924, 131878, 131474]. 3. In rare circumstances, the software caused the autonomous vehicle to hard brake during an unprotected left turn after choosing between two risk scenarios, which led to the crash [130924, 131474]. 4. The National Highway Traffic Safety Administration (NHTSA) required Cruise to address the safety defect in its automated driving systems software through a recall filing [131474]. 5. The software update aimed to improve the robotaxis' ability to predict other vehicles' actions and to prevent similar incidents in the future [130924, 131878]. 6. The incident highlighted the challenges and complexities of developing and operating fully autonomous vehicles, with Cruise facing technical glitches and flaws in its robotaxis [131878]. 7. The incident also drew attention to Cruise's financial losses in trying to build a robotaxi business in San Francisco, with GM disclosing significant losses related to Cruise operations [131474]. |
Preventions | 1. Improved software prediction capabilities to accurately anticipate the behavior of other vehicles on the road, especially in complex scenarios like making unprotected left turns [130924, 131878, 131474]. 2. Enhanced reactivity of the autonomous driving system to sudden changes in the path of other road users to avoid collisions [131474]. 3. Continuous monitoring and updating of the self-driving software to address potential safety issues and ensure the system's effectiveness [131878]. 4. Implementation of over-the-air software updates to quickly address and rectify any software errors or faults without the need for physical recalls [130924]. 5. Strict adherence to safety standards and regulations set by government agencies like the National Highway Traffic Safety Administration (NHTSA) to prioritize the safety of pedestrians, bicyclists, and other road users [130924]. |
Fixes | 1. Implementing a software update that improves the self-driving software's predictions, especially in situations like the one that led to the crash [130924, 131878, 131474]. 2. Conducting a recall and updating the software in the self-driving vehicles to address the safety defect in the automated driving systems software [131474]. 3. Ensuring that the software update corrects the issue where the autonomous driving system incorrectly predicts another vehicle's path or is insufficiently reactive to sudden path changes of road users [131474]. 4. Gradually reintroducing unprotected left turns after the software update to restore the vehicles' operational capabilities [131474]. |
References | 1. Cruise spokesperson Hannah Lindow [130924, 131878] 2. General Motors [130924, 131474] 3. National Highway Traffic Safety Administration (NHTSA) [130924, 131878, 131474] 4. San Francisco police department [131878] 5. California Department of Motor Vehicles [130924, 131474] 6. NHTSA head Steven Cliff [130924] 7. Jeff Bleich, chief legal officer at Cruise [130924] 8. GM CEO Mary Barra [131878] 9. San Francisco Fire Department [131878] 10. Bryant Walker Smith, a professor at the University of South Carolina law school [131878] 11. Reuters [131474] 12. CNN Business [131878] |
Category | Option | Rationale |
---|---|---|
Recurring | one_organization, multiple_organization | (a) The software failure incident having happened again at one_organization: - Cruise, the autonomous driving company owned by General Motors, recalled software deployed on 80 vehicles after a crash in San Francisco in June, where two people were injured due to a flawed software prediction leading to a collision [130924, 131878, 131474]. - Cruise acknowledged that the software issue led to the autonomous vehicle incorrectly predicting the path of an oncoming vehicle, causing the crash [130924, 131878, 131474]. - The company stated that the software update implemented in July addressed the issue, and if the vehicle involved in the crash had been running the updated software, the crash would not have occurred [130924, 131878, 131474]. (b) The software failure incident having happened again at multiple_organization: - Another self-driving developer, Pony.ai, had to recall three self-driving vehicles in March after a software error caused the system to shut down unexpectedly while the vehicles were in motion [130924]. - The increasing amount of software in vehicles has led to more recalls, even among human-driven cars, which can be accomplished through over-the-air updates [130924]. - The National Highway Traffic Safety Administration (NHTSA) has been proactive in addressing software-related issues, as seen in the recalls by Cruise and Pony.ai [130924, 131474]. |
Phase (Design/Operation) | design | (a) The software failure incident in the articles can be attributed to the design phase. The incident involving Cruise's self-driving vehicles in San Francisco was caused by a flaw in the software that incorrectly predicted the path of an oncoming vehicle, leading to a crash [130924, 131878, 131474]. This flaw was identified as a contributing factor introduced during the system development phase, specifically in the software design that impacted the vehicle's ability to react appropriately in certain scenarios, such as making unprotected left turns. The software update implemented by Cruise aimed to address this design flaw and improve the vehicle's predictive capabilities to prevent similar incidents in the future. |
Boundary (Internal/External) | within_system, outside_system | (a) within_system: The software failure incident involving Cruise's self-driving vehicles in San Francisco was primarily attributed to a flaw within the system's software. The incident occurred when the autonomous vehicle incorrectly predicted the path of an oncoming vehicle while attempting to make an unprotected left turn, leading to a collision [130924, 131878, 131474]. Cruise acknowledged that the software was not sufficiently reactive in this scenario and issued a recall to address the safety defect in the automated driving system software [131474]. The company updated the software to improve the vehicles' ability to predict other vehicles' actions, especially in situations similar to the crash [131878]. (b) outside_system: The software failure incident was also influenced by factors outside the system, such as the behavior of the other vehicle involved in the crash. The oncoming vehicle was reported to be traveling at a higher speed than the posted limit and unexpectedly changed its path, contributing to the collision with the Cruise vehicle [130924, 131878, 131474]. Additionally, regulatory bodies like the National Highway Traffic Safety Administration (NHTSA) played a role in overseeing the recall and ensuring that manufacturers prioritize the safety of road users [130924, 131878, 131474]. |
Nature (Human/Non-human) | non-human_actions, human_actions | (a) The software failure incident occurring due to non-human actions: - The software failure incident involving Cruise's autonomous vehicles in San Francisco was due to flawed software that incorrectly predicted the path of an oncoming vehicle, leading to a crash [130924, 131878, 131474]. - Cruise's software update aimed to improve the self-driving software's predictions, especially in situations like the one that led to the crash [130924, 131878]. - The recalled software in Cruise's self-driving vehicles could incorrectly predict another vehicle's path or be insufficiently reactive to sudden path changes of road users, leading to the need for a recall and software update [131474]. (b) The software failure incident occurring due to human actions: - The crash involving Cruise's autonomous vehicles occurred when the Cruise vehicle attempting to make an unprotected left turn was struck by a speeding car traveling in the opposite direction, which did not turn as predicted by Cruise's software [130924, 131878]. - Cruise acknowledged that its robotaxi was not "sufficiently reactive" in the situation that led to the crash, indicating a potential software flaw introduced by human actions in the design or implementation of the software [131878]. - Cruise disclosed that in rare circumstances, the software caused the autonomous vehicle to hard brake while performing an unprotected left turn, indicating a decision-making process within the software that led to the incident [131474]. |
Dimension (Hardware/Software) | software | (a) The software failure incident occurring due to hardware: - There is no specific mention of the software failure incident in the provided articles being attributed to hardware issues. Therefore, it is unknown if the incident was caused by hardware-related factors. (b) The software failure incident occurring due to software: - The software failure incident reported in the articles is attributed to software issues. Cruise recalled and updated software deployed on 80 vehicles after a crash in San Francisco, where the software incorrectly predicted an oncoming vehicle's path, leading to a collision [130924, 131878, 131474]. The software update aimed to improve the robotaxis' ability to predict other vehicles' actions and address safety concerns related to software performance. |
Objective (Malicious/Non-malicious) | non-malicious | (a) The software failure incident described in the articles is non-malicious. The incident involved a crash in San Francisco where a Cruise autonomous vehicle incorrectly predicted the path of an oncoming vehicle, leading to a collision. The software flaw caused the autonomous driving system to incorrectly predict the path of another vehicle or to be insufficiently reactive to sudden path changes, particularly during unprotected left turns [130924, 131878, 131474]. The recall and software update conducted by Cruise after the incident aimed to address this safety defect in the automated driving system software, indicating that the failure was unintentional and not caused by malicious intent [131474]. |
Intent (Poor/Accidental Decisions) | poor_decisions | (a) poor_decisions: The software failure incident involving Cruise's autonomous vehicles in San Francisco was primarily due to poor decisions made by the software. The software incorrectly predicted the path of an oncoming vehicle, leading to a crash. Cruise's software had predicted that the other car would turn right, but it continued straight through the intersection, resulting in a collision [130924, 131878]. Additionally, the software caused the autonomous vehicle to hard brake while making an unprotected left turn, which ultimately led to the crash. Cruise mentioned that the software had to decide between two different risk scenarios and chose the one with the least potential for a serious collision at the time, before the oncoming vehicle's sudden change of direction [131474]. These poor decisions made by the software contributed to the incident. (b) accidental_decisions: The software failure incident does not seem to be primarily attributed to accidental decisions or unintended mistakes in the articles provided. |
Capability (Incompetence/Accidental) | development_incompetence, accidental | (a) The software failure incident related to development incompetence is evident in the articles. Cruise, a self-driving company owned by General Motors, had to recall and update software in 80 self-driving vehicles after a crash in San Francisco that left two people injured. The recalled software was found to "incorrectly predict" an oncoming vehicle's path, leading to the crash. Cruise acknowledged that its robotaxi was not "sufficiently reactive" and that the software caused the autonomous vehicle to hard brake while performing an unprotected left turn, ultimately leading to the crash [131474]. (b) The software failure incident related to accidental factors is also apparent in the articles. The crash occurred when the Cruise vehicle, making a left turn, stopped in the intersection in the expectation that the oncoming vehicle would turn in front of it. Instead, the oncoming vehicle drove straight, striking the Cruise vehicle. Cruise spokeswoman Hannah Lindow declined to specify what the Cruise vehicle could have done differently, indicating that the incident was not intentional but rather the result of unexpected behavior by the other vehicle [131878]. |
Duration | temporary | The software failure incident related to the Cruise autonomous vehicles in San Francisco can be categorized as a temporary failure. The incident occurred due to a specific scenario where the software incorrectly predicted the path of an oncoming vehicle during an unprotected left turn, leading to a crash [130924, 131878, 131474]. After the incident, Cruise implemented a software update to address this specific issue, indicating that the failure was due to contributing factors introduced by certain circumstances but not all, making it a temporary failure. |
Behaviour | crash, omission, value, other | (a) crash: The software failure incident can be categorized as a crash. The incident involved a collision between a Cruise autonomous vehicle and another vehicle, resulting in injuries to two individuals [130924, 131878, 131474]. (b) omission: The incident also involves omission. Cruise's software failed to accurately anticipate the path of the oncoming vehicle, so the autonomous vehicle made a faulty decision and omitted an appropriate evasive response [130924, 131878, 131474]. (c) timing: The articles do not attribute the failure to timing; they focus on the nature of the faulty prediction rather than on when the system responded. (d) value: The incident can be linked to a value failure. The software produced an incorrect value, namely its prediction of the oncoming vehicle's path, and the decision based on that prediction resulted in the crash [130924, 131878, 131474]. (e) byzantine: The incident does not exhibit byzantine behavior; it centers on a single scenario in which the software mispredicted another vehicle's behavior rather than on inconsistent responses or interactions. (f) other: The behavior can also be described as a decision-making failure. The software made a faulty decision based on its prediction of the other vehicle's behavior, ultimately leading to the crash [130924, 131878, 131474]. |
Layer | Option | Rationale |
---|---|---|
Perception | sensor, embedded_software | (a) sensor: The software failure incident involving Cruise's autonomous vehicles in San Francisco was related to the perception layer of the cyber physical system: the vehicle failed to correctly perceive and predict the behavior of the oncoming vehicle before the crash. Cruise acknowledged that its robotaxi was not "sufficiently reactive" and that the recalled software could "incorrectly predict" another vehicle's path; the articles do not report a specific sensor hardware fault [131878]. (b) actuator: The articles do not mention any issues related to the actuator layer of the cyber physical system. (c) processing_unit: The articles do not mention any issues related to the processing unit layer of the cyber physical system. (d) network_communication: The articles do not mention any issues related to the network communication layer of the cyber physical system. (e) embedded_software: The software failure incident was directly related to the embedded software in Cruise's autonomous vehicles. Cruise recalled and updated the software in 80 self-driving vehicles after the crash in San Francisco, which was caused by a software error that led the autonomous vehicle to incorrectly predict the path of another vehicle. The software update aimed to improve the robotaxis' ability to predict other vehicles' actions [131474]. |
Communication | unknown | The software failure incident reported in the articles does not specifically mention whether the failure was related to the communication layer of the cyber physical system that failed at the link_level or connectivity_level. The focus of the incident was on the software error in predicting the path of an oncoming vehicle during an unprotected left turn, leading to a crash involving a Cruise autonomous vehicle in San Francisco [130924, 131878, 131474]. |
Application | TRUE | [130924, 131878, 131474] The software failure incident involving Cruise's self-driving vehicles in San Francisco, which led to a recall and software update, was related to the application layer of the cyber physical system. The failure was attributed to the software incorrectly predicting an oncoming vehicle's path or being insufficiently reactive to sudden path changes of road users during unprotected left turns. This issue was addressed through a software update that improved the vehicles' ability to predict other vehicles' actions [131474]. Cruise acknowledged that the software was not "sufficiently reactive" in the scenario that led to the crash, indicating a failure related to the application layer [131878]. The company mentioned that the software update enhanced the self-driving software's predictions, especially in situations similar to the one that caused the crash, further confirming the failure at the application layer [130924]. |
Category | Option | Rationale |
---|---|---|
Consequence | harm | The consequence of the software failure incident was harm: two people were injured in the June crash involving a Cruise car operating autonomously in San Francisco [130924, 131878, 131474]. |
Domain | transportation, health | (a) The failed system was related to the transportation industry, specifically autonomous driving. The software failure incident involved Cruise, a company operating self-driving vehicles in San Francisco, which had to recall and update software on 80 vehicles after a crash that resulted in injuries [130924, 131878, 131474]. (j) The incident also relates indirectly to the health industry, as the crash caused by the software failure resulted in injuries to individuals [130924, 131878, 131474]. |
Article ID: 130924
Article ID: 131878
Article ID: 131474