Published Date: 2016-02-29
Postmortem Analysis | |
---|---|
Timeline | 1. The software failure incident involving a Google self-driving car and a public bus occurred on February 14, 2016 [40369, 41569, 41068, 40925]. |
System | 1. Google's self-driving car software failed to correctly predict the behavior of the public transit bus, leading to a collision [40369, 41569, 41068, 40925]. 2. The software failed to handle negotiation and interaction with other vehicles on the road, and did not adequately account for the behavior of buses and other large vehicles in its decision-making [41569, 41068, 40925]. 3. The autonomous driving software failed to avoid a collision while maneuvering around sandbags blocking its lane [40925]. 4. The software did not prevent the self-driving car from making a risky lane change that resulted in the collision with the bus [40925]. |
Responsible Organization | 1. Google's self-driving car software - The incident, in which a Google self-driving car collided with a public bus, was attributed to the software's decision-making and prediction errors [40369, 41569, 41068, 40925]. |
Impacted Organization | 1. Google's self-driving car [40369, 41569, 41068, 40925] 2. The Santa Clara Valley Transportation Authority, whose bus was damaged and whose passengers had to be transferred to another bus [41569, 41068] |
Software Causes | 1. The software cause of the failure incident was the autonomous vehicle's decision-making algorithm, which misjudged the bus's behavior, resulting in a collision [40369, 41569]. 2. Google acknowledged that its software needed refinement to better understand that buses and large vehicles are less likely to yield, indicating a software limitation in predicting the behavior of different types of vehicles [41569]. 3. The incident highlighted a misunderstanding between the autonomous vehicle and the bus driver, where both parties made assumptions about the other's behavior, leading to a collision [40925]. |
Non-software Causes | 1. The Google autonomous car changed lanes to get around sandbags blocking its path, leading to the collision with the bus [40369, 41569, 41068, 40925]. 2. The Google car's test driver believed the bus would yield and did not intervene to prevent the collision [41569, 41068, 40925]. 3. Misunderstandings and assumptions between the Google car, the test driver, and the bus driver contributed to the incident [40925]. 4. The Google car was following a recent change in its programming to hug the far side of the right-turn lane, which led to the collision [40925]. |
Impacts | 1. The Google self-driving car's left front struck the right side of the bus, crumpling the car's front left side, flattening a tire, damaging the left front fender and front wheel, and tearing off a driver-side radar sensor [41569, 41068]. 2. The bus sustained minor damage at the "pivoting joint," the flexible area in the middle of the articulated vehicle [41068]. 3. The collision left a long scratch mark on the side of the bus, and the car's radar unit ended up wedged in the crack where two side passenger doors of the bus join, cracking a glass panel [41569]. |
Preventions | 1. Improved software algorithms to better predict the behavior of buses and other large vehicles on the road could have prevented the incident [41569]. 2. Enhanced communication between the autonomous vehicle's software and the test driver to ensure better coordination and intervention in critical situations could have helped avoid the collision [41068]. 3. Implementation of more advanced sensors and cameras to provide a more comprehensive view of the surroundings and potential obstacles, such as sandbags, could have aided in preventing the crash [40925]. |
Fixes | 1. Implementing software refinements: Google reviewed the incident in detail and refined its software to better understand that buses and other large vehicles are less likely to yield, aiming to handle such situations more gracefully in the future [41068]. 2. Enhancing the software's ability to predict and react to other vehicles: Google's car detected the approaching bus but incorrectly predicted that it would yield, leading to the collision; improving the software's predictive and decision-making capabilities regarding other vehicles could help prevent similar incidents [40925]. 3. Continuous testing and simulation: Google reviewed thousands of variations of the incident in its simulator to refine the software; continued testing, simulation, and learning from different scenarios can improve the software's performance in real-world situations [41068]. 4. Addressing the negotiation aspect of driving: Google described the incident as a misunderstanding of the kind that commonly occurs between human drivers; further refining the software to navigate negotiation scenarios on the road could reduce the likelihood of collisions [40925]. |
References | 1. Accident report filed with the California DMV [40369, 41569] 2. Santa Clara Valley Transportation Authority [41569] 3. Google's statement [41068, 40925] 4. Spokespeople from the Santa Clara Valley Transportation Authority, the DMV, and Google [41068, 40925] 5. Independent claims adjuster [41569] 6. Jessica Gonzalez, DMV spokeswoman [40925] 7. Hilary Rowen, partner at Sedgwick LLP [40925] |
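The first and third fixes above (treating large vehicles as less likely to yield, and sweeping many variations of the scenario in simulation) can be illustrated with a minimal sketch. All names, probabilities, speeds, and the toy collision model below are assumptions for illustration only; none of this reflects Google's actual software.

```python
import itertools

# (1) Class-conditioned yield prediction: large vehicles such as buses
# are treated as less likely to yield than passenger cars.
# All values here are hypothetical.
BASE_YIELD_PROBABILITY = 0.8
YIELD_ADJUSTMENT = {"car": 1.0, "truck": 0.6, "bus": 0.5}

def predicted_yield_probability(vehicle_class: str) -> float:
    """Estimated probability that the approaching vehicle yields."""
    return BASE_YIELD_PROBABILITY * YIELD_ADJUSTMENT.get(vehicle_class, 1.0)

def should_merge(vehicle_class: str, threshold: float = 0.7) -> bool:
    """Merge only if the other vehicle is judged likely enough to yield."""
    return predicted_yield_probability(vehicle_class) >= threshold

# (2) Replay variations of the recorded scenario in a toy simulator,
# in the spirit of the "thousands of variations" Google describes.
def collision_occurs(bus_speed_mps: float, gap_m: float) -> bool:
    """Toy model: the merge fails if the bus closes the gap before the
    car, accelerating at 1 m/s^2 from rest, covers the 4 m it needs to
    clear the lane (s = 0.5 * a * t^2, solved for t)."""
    t_clear = (2 * 4.0 / 1.0) ** 0.5
    return bus_speed_mps * t_clear >= gap_m

variations = list(itertools.product([4.0, 6.7, 9.0],    # bus speed, m/s
                                    [5.0, 10.0, 20.0]))  # gap, m
collisions = [v for v in variations if collision_occurs(*v)]

print(should_merge("car"), should_merge("bus"))  # True False
print(f"{len(collisions)} of {len(variations)} variations collide")  # 7 of 9
```

A real refinement pipeline would replace both the hand-set yield factors and the closed-form collision check with learned models and full vehicle dynamics; the sketch only shows the shape of the idea.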
Category | Option | Rationale |
---|---|---|
Recurring | one_organization | (a) The software failure incident having happened again at one_organization: - Google's self-driving car caused a crash with a public bus in Mountain View, resulting in damage to the car and the bus [40369, 41569, 41068, 40925]. - Google accepted partial responsibility for the crash, acknowledging that its car made a mistake that led to the collision [41569, 41068, 40925]. - This was the first time Google acknowledged its car's mistake causing a collision, despite previous accidents being attributed to human error [41569, 40925]. - Google stated that it made changes to its software after the crash to prevent similar incidents in the future [41068]. - The incident involved a Google self-driving car attempting to navigate around sandbags on the road, leading to a collision with a public bus [40925]. (b) The software failure incident having happened again at multiple_organization: - There is no information in the provided articles about similar incidents happening at other organizations or with their products and services. |
Phase (Design/Operation) | design, operation | (a) In the reported software failure incident related to the Google self-driving car crash with a public bus, the incident can be attributed to a design failure. The incident occurred when the self-driving car was trying to get around some sandbags on a street, leading to a collision with the bus [40369, 41569, 41068, 40925]. This design flaw in the software's decision-making process and response to obstacles ultimately resulted in the crash. (b) Additionally, the incident can also be linked to an operational failure. The test driver of the self-driving car, who was required to be in the front seat to take control in emergencies, did not intervene to prevent the collision with the bus. The driver's expectation that the bus would yield, coupled with the software's prediction that the bus would slow down, contributed to the operational failure that led to the crash [40369, 41569, 41068, 40925]. |
Boundary (Internal/External) | within_system, outside_system | (a) The software failure incident involving Google's self-driving car colliding with a public bus can be categorized as a within_system failure. The incident occurred when the autonomous vehicle, operating in autonomous mode, attempted to navigate around sandbags on the road and misjudged the bus's behavior, leading to a collision [40369, 41569, 41068, 40925]. The software controlling the car was programmed to follow traffic laws and drive conservatively, but it failed to accurately predict the bus's actions, resulting in the collision. Google acknowledged that its software needed refinement to better understand that buses and large vehicles are less likely to yield compared to other vehicles [41569, 41068]. The incident highlighted the need for adjustments in the software to handle such scenarios more effectively in the future [40925]. (b) The software failure incident can also be attributed to contributing factors originating from outside the system. For example, the incident involved a complex interaction between the autonomous vehicle and the bus driver, where assumptions were made by both parties regarding each other's actions on the road [40925]. Additionally, the incident report did not assign fault to any specific party, indicating that external factors such as human drivers' behaviors and road conditions played a role in the collision [40369]. The involvement of external entities like the bus driver, other vehicles on the road, and the physical environment influenced the outcome of the software failure incident. |
Nature (Human/Non-human) | non-human_actions, human_actions | (a) The software failure incident occurring due to non-human actions: - The incident involved a Google self-driving car colliding with a public bus in Mountain View. The car was trying to get around some sandbags on a street when its left front struck the right side of the bus, leading to the collision [40925]. - Google's car was in autonomous mode and driving at 2 mph when it made contact with the side of the bus, causing damage to the left front fender, front wheel, and a driver-side sensor. The collision occurred as the Google car re-entered the center of the lane after stopping for sandbags, and both the car's software and the person in the driver's seat believed the bus would yield [41068]. - The collision was captured on video, showing the Google self-driving car edging into the path of the bus that was rolling by at about 15 mph. The footage revealed that the car's decision to try and slip in front of the bus was surprising given the law-abiding algorithms controlling the car's onboard computer [41569]. (b) The software failure incident occurring due to human actions: - Google acknowledged that it bore some responsibility for the collision with the public bus, stating that if their car hadn't moved, there wouldn't have been a collision. The test driver believed the bus would slow or stop to allow the Google vehicle to merge into traffic, contributing to the incident [41068]. - The Google car's test driver, who was required to be in the front seat to take control in an emergency, did not intervene before the collision as they believed the bus would yield. The incident was described as a misunderstanding that happens between human drivers on the road every day [40925]. - The incident report highlighted that the Google car's test driver and the car's software both expected the bus to yield, leading to the collision. Google mentioned that they have refined their software to better understand that buses and other large vehicles are less likely to yield in such situations [41569]. |
Dimension (Hardware/Software) | hardware, software | (a) The software failure incident occurring due to hardware: - The incident involving a Google self-driving car colliding with a public bus in Mountain View was attributed to the car trying to get around sandbags on the street, leading to the collision [40925]. - The collision caused damage to the left front fender, front wheel, and a driver-side sensor of the Google car, as well as damage to the bus [41068]. - The impact crumpled the front left side of the Lexus, flattened the tire, and tore off the radar Google had installed to help the SUV perceive its surroundings [41569]. (b) The software failure incident occurring due to software: - Google mentioned that it reviewed the incident in detail and made refinements to its software to avoid similar incidents in the future [41068]. - Google stated that it has refined its software following the incident, acknowledging that buses and other large vehicles are less likely to yield, and aims to handle such situations more gracefully in the future [40925]. - Google mentioned that its cars have been involved in accidents before, all caused by human error, indicating that the software itself was not the direct cause of those incidents [40369]. |
Objective (Malicious/Non-malicious) | non-malicious | (a) The articles do not mention any malicious intent behind the software failure incident reported. [40369, 41569, 41068, 40925] (b) The software failure incident was non-malicious, occurring due to contributing factors introduced without intent to harm the system. The incident involved a Google self-driving car colliding with a public bus due to a misunderstanding and miscalculation of the bus's actions by the autonomous vehicle's software and test driver. Google acknowledged its responsibility in the incident and made changes to its software to better handle similar situations in the future. The incident was described as a result of a "misunderstanding" that can happen between human drivers on the road every day. The software failure was a consequence of the autonomous vehicle's decision-making process and interaction with the bus, rather than any malicious intent. |
Intent (Poor/Accidental Decisions) | poor_decisions, accidental_decisions | (a) poor_decisions: The software failure incident involving Google's self-driving car colliding with a public bus was due to poor decisions made by the autonomous vehicle's software and the test driver. The incident occurred when the car attempted to get around sandbags on the road and misjudged the bus's behavior, leading to a collision [40369, 41569, 41068, 40925]. (b) accidental_decisions: The software failure incident can also be attributed to accidental decisions or unintended consequences. The car's software and the test driver did not anticipate the bus's actions correctly, resulting in the collision. It was described as a misunderstanding that can happen between human drivers on the road as well [40369, 41569, 41068, 40925]. |
Capability (Incompetence/Accidental) | development_incompetence | (a) The software failure incident occurring due to development incompetence: - The incident involving Google's self-driving car colliding with a public bus was attributed to a mistake made by the autonomous vehicle's software and the test driver's assumptions. Google acknowledged that they bear some responsibility for the crash, stating that if their car hadn't moved, the collision wouldn't have occurred [40369, 41569, 41068, 40925]. (b) The software failure incident occurring accidentally: - The incident was described as a misunderstanding that can happen between human drivers on the road every day, indicating that it was an accidental outcome of various assumptions made by both the autonomous vehicle's software and the human test driver [40925]. |
Duration | temporary | (a) The software failure incident in the articles appears to be temporary. The incident was a result of specific circumstances such as the car trying to get around sandbags on the street, misjudgment of the bus driver's actions, and the car's prediction that the bus would yield. Google mentioned that they have refined their software to better understand that buses and large vehicles are less likely to yield in such situations [40369, 41569, 41068, 40925]. |
Behaviour | crash, omission, other | (a) crash: The software failure incident in the articles can be categorized as a crash. The incident involved a Google self-driving car colliding with a public bus, resulting in damage to the car and the bus. The crash occurred when the autonomous vehicle attempted to get around some sandbags on the street but ended up striking the side of the bus [40369, 41569, 41068, 40925]. (b) omission: The incident can also be related to omission as the autonomous vehicle failed to perform its intended function of avoiding a collision with the bus. The car's software and the test driver believed the bus would yield, but the collision occurred when the car re-entered the center of the lane and struck the bus [41569, 41068, 40925]. (c) timing: The timing of the incident could be considered a factor in the failure. The car was moving at 2 mph while the bus was traveling at 15 mph. The collision occurred when the car re-entered the lane, possibly at an incorrect time in relation to the bus's movement [40369, 41068, 40925]. (d) value: The incident does not directly relate to a failure due to the system performing its intended functions incorrectly. (e) byzantine: The incident does not exhibit characteristics of a byzantine failure where the system behaves erroneously with inconsistent responses and interactions. (f) other: The behavior of the software failure incident could also be described as a misjudgment or miscalculation. Both the car's software and the test driver misjudged the bus's actions, leading to the collision. The incident highlights the challenges of predicting and reacting to the behavior of other vehicles on the road [40369, 41569, 41068, 40925]. |
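The timing point in (c) above can be made concrete with rough closing-speed arithmetic. The speeds (car at about 2 mph, bus at about 15 mph) come from the reports; the initial gap is an assumed value for illustration only.

```python
# Rough closing-speed arithmetic for the timing factor.
# Speeds are taken from the incident reports (car ~2 mph, bus ~15 mph);
# the initial gap is an assumed value for illustration.
MPH_TO_MPS = 0.44704  # exact conversion factor, miles per hour to m/s

car_speed = 2 * MPH_TO_MPS    # ~0.89 m/s
bus_speed = 15 * MPH_TO_MPS   # ~6.71 m/s

# With both vehicles moving in the same direction, the bus closes the
# gap at the difference of their speeds.
closing_speed = bus_speed - car_speed  # ~5.8 m/s

gap_m = 10.0  # assumed initial separation, metres
time_to_contact = gap_m / closing_speed
print(f"closing speed = {closing_speed:.1f} m/s, "
      f"contact in {time_to_contact:.1f} s over a {gap_m:.0f} m gap")
```

At roughly 5.8 m/s of closing speed, even a modest gap closes in under two seconds, which is consistent with neither the software nor the test driver having time to correct the misprediction once the car committed to the lane re-entry.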
Layer | Option | Rationale |
---|---|---|
Perception | sensor, actuator, embedded_software | (a) sensor: - The incident involved the Google self-driving car trying to get around some sandbags on a street when its left front struck the right side of the bus, indicating a sensor-related issue [40925]. - The collision happened when the Google autonomous vehicle was re-entering the center of the lane and made contact with the side of the bus, suggesting a sensor failure in detecting the bus's position [40369]. - The impact crumpled the Lexus' front left side, flattened the tire, and tore off the radar Google had installed to help the SUV perceive its surroundings, indicating a sensor failure [41569]. (b) actuator: - The incident involved the Google self-driving car striking a municipal bus when it sought to get around some sandbags in a wide lane, suggesting an actuator-related issue [41068]. - The Google car in autonomous mode re-entered the center of the lane and struck the side of the bus, causing damage to the left front fender, front wheel, and a driver-side sensor, indicating a potential actuator failure [41068]. (c) processing_unit: - The incident report did not specifically mention any failures related to the processing unit of the self-driving car [unknown]. (d) network_communication: - The articles did not mention any failures related to network communication contributing to the incident [unknown]. (e) embedded_software: - Google mentioned that they have refined their software following the incident, indicating a potential issue with the embedded software [40925]. - Google stated that they have made changes to their software after the crash to avoid future incidents, suggesting a potential issue with the embedded software [41068]. |
Communication | unknown | Unknown |
Application | FALSE | [40369, 41569, 41068, 40925] The software failure incident related to the Google self-driving car colliding with a public bus in Mountain View was not directly related to the application layer of the cyber physical system. The incident was primarily attributed to the autonomous vehicle's decision-making process, misjudgment of the bus's behavior, and the interaction between the self-driving car and the bus, rather than being caused by bugs, operating system errors, unhandled exceptions, or incorrect usage at the application layer. |
Category | Option | Rationale |
---|---|---|
Consequence | property, delay, non-human, theoretical_consequence | (a) death: - There were no reports of deaths related to the software failure incident involving Google's self-driving car [40369, 41569, 41068, 40925]. (b) harm: - No individuals were physically harmed as a result of the incident [40369, 41569, 41068, 40925]. (c) basic: - There is no mention of people's access to food or shelter being impacted [40369, 41569, 41068, 40925]. (d) property: - The incident resulted in damage to the front left wheel and fender of the car, as well as damage to the bus involved in the collision [40369, 41569, 41068, 40925]. (e) delay: - The incident delayed the 15 passengers on the bus, who had to be transferred to another bus after the collision [41569]. (f) non-human: - The incident damaged the Google self-driving car and the public bus, both non-human entities [40369, 41569, 41068, 40925]. (g) no_consequence: - Not applicable: the incident resulted in physical damage to the vehicles involved, so there were real observed consequences [40369, 41569, 41068, 40925]. (h) theoretical_consequence: - Theoretical consequences discussed included the potential for the software to be at fault for the collision, the need for liability determination, and the refinement of software to prevent future incidents [40369, 41569, 41068, 40925]. (i) other: - No consequences beyond those described above were mentioned in the articles [40369, 41569, 41068, 40925]. |
Domain | information, transportation | (a) The failed system was related to the information industry as it involved Google's self-driving car technology, which is aimed at revolutionizing transportation through autonomous vehicles [40369, 41569, 41068, 40925]. (b) The transportation industry was impacted by the software failure incident as it involved a collision between Google's self-driving car and a public bus, highlighting challenges in autonomous vehicle technology [40369, 41569, 41068, 40925]. (m) The software failure incident does not directly relate to any other industry beyond information and transportation as discussed in the articles [40369, 41569, 41068, 40925]. |
Article ID: 40369
Article ID: 41569
Article ID: 41068
Article ID: 40925