Incident: Tesla Autopilot Failure Leads to Fatal Crash Impacting Driver Safety

Published Date: 2018-03-30

Postmortem Analysis
Timeline 1. The crash of the Tesla Model X SUV while the Autopilot feature was engaged occurred on the morning of Friday, March 23 [68923]. 2. The article was published on 2018-03-30, placing the incident in March 2018.
System 1. Tesla's Autopilot feature [68923]
Responsible Organization 1. The driver, Wei Huang, who did not keep his hands on the wheel and ignored warnings from the Autopilot system [68923]. 2. Tesla, which the National Transportation Safety Board said should bear some of the blame for selling a system that is too easy to misuse [68923].
Impacted Organization 1. The driver, Wei Huang, who died in the crash [68923] 2. Tesla, the company behind the Autopilot feature that was controlling the car during the incident [68923]
Software Causes 1. Tesla's Autopilot feature was controlling the Model X SUV when it crashed into a concrete highway lane divider; the system, which has a known weakness in detecting stationary objects, did not avoid the divider or prevent the fatal impact [68923].
Non-software Causes 1. The crash attenuator on the concrete highway lane divider, designed to crumple and absorb impact, had been crushed in a previous accident and not replaced, worsening the severity of the crash [68923].
Impacts 1. The software failure incident involving Tesla's Autopilot system resulted in a fatal crash where the driver, Wei Huang, died after his Model X SUV slammed into a concrete highway lane divider and burst into flames [68923]. 2. The incident raised questions about the reliability and limitations of Tesla's Autopilot system, highlighting the need for constant human supervision while using the feature [68923]. 3. The National Highway Traffic Safety Administration concluded that the Autopilot system was operating as intended and not defective, attributing the crash to the driver's actions [68923]. 4. The National Transportation Safety Board placed some blame on Tesla for selling a system that could be easily misused, indicating a need for better safeguards and warnings in the software [68923]. 5. Following the incident, Tesla made modifications to the Autopilot system, including relying more on radar data, introducing brighter warnings, and limiting the time a driver can let go of the wheel [68923]. 6. The crash involving Tesla's Autopilot system, along with other incidents in the autonomous vehicle industry, marked a challenging period for the development and adoption of self-driving technology, emphasizing the current limitations of such systems [68923].
Preventions 1. Stricter monitoring mechanisms ensuring that drivers are actively engaged and ready to take control of the vehicle could have prevented the crash [68923]. 2. Better detection of and reaction to stationary objects, such as stopped vehicles, could have averted the collision [68923]. 3. Improved handling of situations where lane markings disappear or lanes split could have avoided the accident [68923].
Fixes 1. Implement stricter monitoring systems to ensure drivers are actively engaged and ready to take control of the vehicle when necessary [68923]. 2. Enhance the Autopilot system to better detect and react to stationary objects, such as stopped vehicles, to prevent collisions [68923]. 3. Strengthen the warning mechanisms so drivers are alerted more effectively when they need to intervene, with more frequent and attention-grabbing alerts [68923]. 4. Conduct thorough investigations into each incident to identify system weaknesses and make the adjustments needed to prevent similar failures [68923]. A minimal sketch of how fixes 1 and 3 could combine into an escalation policy follows.
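The sketch below illustrates one way fixes 1 and 3 could work together: escalate from a visual alert to an audible one as hands-off time grows, and ultimately slow the car and disengage assistance. All thresholds, names, and behaviors are hypothetical illustrations, not Tesla's actual implementation; the article says only that Tesla curtailed hands-off time and introduced brighter warnings.

    # Hypothetical thresholds for illustration only; not Tesla's actual values.
    VISUAL_WARNING_S = 5.0    # flashing visual alert after 5 s hands-off
    AUDIBLE_WARNING_S = 10.0  # add an audible chime after 10 s
    DISENGAGE_S = 15.0        # slow the car and disengage assistance after 15 s

    def escalation_action(hands_off_seconds: float) -> str:
        """Map continuous hands-off time to an escalating response."""
        if hands_off_seconds >= DISENGAGE_S:
            return "slow_and_disengage"
        if hands_off_seconds >= AUDIBLE_WARNING_S:
            return "audible_warning"
        if hands_off_seconds >= VISUAL_WARNING_S:
            return "visual_warning"
        return "none"

    # Example: hands not detected on the wheel for 12 seconds.
    print(escalation_action(12.0))  # -> "audible_warning"

The key property of such a policy is that the response never falls back to silence while hands remain off the wheel; in the crash described above, the driver's hands were not detected for the final six seconds despite earlier warnings.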
References 1. Tesla's blog post [68923] 2. National Highway Traffic Safety Administration [68923] 3. National Transportation Safety Board [68923]

Software Taxonomy of Faults

Category Option Rationale
Recurring one_organization, multiple_organization (a) The software failure incident related to Tesla's Autopilot system causing a fatal crash has happened again within the same organization. This incident involving a Tesla Model X SUV crashing into a concrete highway lane divider while Autopilot was engaged is the second confirmed fatal crash on US roads in which Tesla's Autopilot system was controlling the car [68923]. (b) The software failure incident involving Tesla's Autopilot system causing fatal crashes has also happened at other organizations or with their products and services. The article mentions another fatal crash involving a Tesla Model S using Autopilot in which the system failed to detect a white truck against a bright sky, resulting in a fatal collision [68923].
Phase (Design/Operation) design, operation (a) The software failure incident related to the design phase can be seen in the case of the Tesla Autopilot system. The article mentions that the Autopilot system, which combines radar-controlled cruise control with automatic steering, has known weaknesses such as not being able to see stationary objects. This limitation was highlighted when a Tesla crashed into a stopped firetruck near Los Angeles [68923]. Additionally, the National Transportation Safety Board criticized Tesla for selling a system that is too easy to misuse, indicating design flaws in the system [68923]. (b) The software failure incident related to the operation phase is evident in the case of the Tesla Model X crash where the driver, Wei Huang, was using the Autopilot feature. Despite warnings and reminders to keep hands on the wheel and monitor the road, the driver's hands were not detected on the wheel for six seconds prior to the impact. This indicates a failure in the operation or misuse of the Autopilot system by the driver [68923].
Boundary (Internal/External) within_system, outside_system (a) The software failure incident involving the Tesla Model X crashing into a concrete highway lane divider while using the Autopilot feature can be categorized as a within_system failure. The incident was attributed to factors within the system, such as the limitations and weaknesses of the Autopilot system itself. The article mentions that the Autopilot system may not see stationary objects, which was highlighted in a previous incident where a Tesla crashed into a stopped firetruck [68923]. Additionally, the National Transportation Safety Board stated that Tesla should bear some of the blame for selling a system that is too easy to misuse, indicating internal system issues [68923]. (b) On the other hand, external factors also played a role in the software failure incident. For example, the article mentions that the concrete barrier that the Tesla Model X hit was supposed to have a crash attenuator, which had been crushed in a previous accident and not replaced. This external factor contributed to the severity of the crash and the damage to the vehicle [68923].
Nature (Human/Non-human) non-human_actions, human_actions (a) The software failure incident related to non-human actions in the article is the fatal crash involving a Tesla Model X SUV with Autopilot engaged. The crash occurred when the vehicle slammed into a concrete highway lane divider and burst into flames. The investigation revealed that the severity of the crash was partly due to the barrier's crash attenuator having been crushed in a previous accident and not replaced [68923]. (b) The software failure incident related to human actions in the article involves the driver, Wei Huang, who was using Tesla's Autopilot feature at the time of the crash. His hands were not detected on the wheel for the six seconds prior to impact, despite multiple warnings to put them back on the wheel. The article highlights that drivers need to be ready to take control if lane markings disappear or lanes split, underscoring the need for human supervision while using the Autopilot system [68923].
Dimension (Hardware/Software) hardware, software (a) The software failure incident related to hardware: - The article mentions that the concrete highway lane divider that the Tesla Model X SUV slammed into was supposed to have a crash attenuator, which crumples to absorb some of the impact. However, it had been crushed in a previous accident and not replaced, contributing to the severity of the crash [68923]. (b) The software failure incident related to software: - The incident involving the Tesla Model X SUV crashing into a concrete highway lane divider was linked to the Autopilot feature being turned on. The Autopilot system, which is a software-driven semi-autonomous driving system, was controlling the car at the time of the crash. The system relies on constant human supervision, and the driver is supposed to keep their hands on the wheel and monitor the road. Ignoring warnings and not following the system's guidelines can lead to accidents, as seen in this case [68923].
Objective (Malicious/Non-malicious) non-malicious (a) The software failure incident related to the Tesla Model X crash involving the Autopilot system does not appear to be malicious. The incident was a result of the limitations and weaknesses of the Autopilot system, as well as the driver's failure to adhere to the system's requirements for human supervision and intervention. The crash was attributed to factors such as the driver not having his hands on the wheel, the system not detecting stationary objects, and the driver potentially being distracted or over-relying on the system's capabilities [68923]. (b) The software failure incident can be categorized as non-malicious, as it was not caused by intentional actions to harm the system but rather by a combination of system limitations, human error, and environmental factors. The incident highlights the challenges and risks associated with semi-autonomous driving systems and the importance of maintaining human oversight and responsibility while using such technologies [68923].
Intent (Poor/Accidental Decisions) poor_decisions, accidental_decisions (a) The software failure incident involving the Tesla Model X crash while using Autopilot can be attributed to poor_decisions. The incident highlights how the Autopilot system, although designed as a driver assistance tool, can lull drivers into a false sense of security, leading to distraction and inadequate supervision [68923]. (b) The incident also involves accidental_decisions: the driver, Wei Huang, did not keep his hands on the wheel despite multiple warnings from the system. This lapse, more likely the result of distraction or over-reliance on the system than a deliberate choice, ultimately contributed to the fatal crash [68923].
Capability (Incompetence/Accidental) development_incompetence, accidental (a) The development incompetence aspect can be seen in the design of the Tesla Autopilot system itself. The National Transportation Safety Board criticized Tesla for selling a system that is too easy to misuse, indicating a design flaw that invites the kind of hands-off driving that preceded the fatal Model X crash into a concrete highway lane divider [68923]. (b) The accidental aspect of the software failure incident can be observed in the circumstances of the crash. The impact was worsened by the barrier the vehicle hit, whose crash attenuator had been crushed in a previous accident and not replaced [68923]. This accidental factor, the missing crash attenuator, contributed to the severity of the impact and the resulting damage to the vehicle.
Duration permanent, temporary (a) The software failure incident in the article is more of a permanent nature. The incident involving the Tesla Model X crashing into a concrete highway lane divider while using the Autopilot feature highlights the inherent weaknesses and limitations of the system. The article mentions that the Autopilot system, although designed to assist drivers, can lull them into a false sense of security, potentially leading to accidents if drivers do not maintain constant vigilance and control [68923]. (b) However, there are also elements of a temporary nature in the software failure incident. For example, after the first fatal crash involving Autopilot, Tesla made modifications to the system through software updates to address some of the issues. Changes included relying more on radar data, introducing brighter warnings, and limiting the time a driver can let go of the wheel. These adjustments indicate that the software failure was not entirely permanent, as measures were taken to improve the system's performance and address specific contributing factors [68923].
Behaviour crash, omission, timing, value, other (a) crash: The software failure incident in the article is related to a crash. The Tesla Model X SUV crashed into a concrete highway lane divider while the Autopilot feature was turned on, resulting in a fatal accident [68923]. (b) omission: The failure also involves omission. The driver's hands were not detected on the wheel for the six seconds prior to impact, and beyond issuing warnings the system took no further action to prevent the collision [68923]. (c) timing: The timing of the system's response is also a factor. The driver was warned to put his hands back on the wheel, but the system's response may have come too late, as it did not prevent the crash [68923]. (d) value: The failure can also be linked to the system performing its intended functions incorrectly. Although designed to keep the car in its lane and maintain a safe distance from other vehicles, the system did not prevent the fatal crash in this instance [68923]. (e) byzantine: The behavior of the system in this incident does not align with a byzantine failure, as there is no mention of inconsistent responses or interactions; the failure is a straightforward inability to prevent the crash despite warnings and safety features [68923]. (f) other: A further behavior observed in this incident is the system lulling the driver into a false sense of security. Critics note that the ease with which Tesla's Autopilot handles regular freeway driving can lead drivers to believe it is more capable than it actually is, potentially contributing to accidents like the one described [68923].

IoT System Layer

Layer Option Rationale
Perception sensor, processing_unit, embedded_software (a) sensor: The article mentions that Tesla's Autopilot system may not see stationary objects, a weakness highlighted when a Tesla slammed into a stopped firetruck near Los Angeles in January. The system is designed to discard radar data about things that aren't moving to prevent false alarms (see the sketch after this table) [68923]. (b) actuator: The article does not mention any failures related to actuators. (c) processing_unit: The article discusses how Tesla modified Autopilot after a fatal crash to rely more on data from radar and less on the camera to spot obstacles in the car's path. Tesla also sent out a software update that curtailed the length of time a driver can let go of the wheel and introduced brighter, flashing warnings [68923]. (d) network_communication: The article does not provide information about failures related to network communication. (e) embedded_software: The article discusses how Tesla introduced Autopilot via over-the-air software updates and modified it after a fatal crash to rely more on radar data. Tesla also sent out a software update to address issues with driver supervision and warnings [68923].
Communication unknown The failure incident reported in the articles does not directly relate to a failure at the communication layer of the cyber-physical system. The incident primarily involves the failure of Tesla's Autopilot system in a fatal crash scenario, where the driver did not maintain proper supervision and control as required by the system's design and warnings. The failure is more related to the limitations and misuse of the driver assistance system rather than a failure at the communication layer of the cyber-physical system. Therefore, the options of link_level and connectivity_level are not applicable in this context.
Application TRUE The software failure incident involving Tesla's Autopilot system, specifically the fatal crash of the Model X SUV, can be attributed to the application layer of the cyber-physical system. The failure was related to contributing factors such as incorrect usage and the system's limitations in handling unexpected scenarios. The incident highlighted the driver not keeping his hands on the wheel despite warnings, the system not detecting stationary objects, and the Autopilot system's limits when lane markings disappear or lanes split [68923]. These factors point to a failure at the application layer due to incorrect usage and gaps in scenario handling.
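To make the stationary-object weakness in the Perception row concrete, here is a minimal sketch of the kind of radar filtering the article describes: returns whose ground speed is near zero are discarded to suppress false alarms from overhead signs and roadside clutter. Every name, type, and threshold below is a hypothetical illustration, not Tesla's actual pipeline.

    from dataclasses import dataclass

    @dataclass
    class RadarReturn:
        range_m: float        # distance to the detected object (m)
        closing_speed: float  # speed at which the ego car approaches it (m/s)

    def ground_speed(ret: RadarReturn, ego_speed: float) -> float:
        # An object's own speed over the ground: if we close on it exactly
        # as fast as we are driving, the object is standing still.
        return ego_speed - ret.closing_speed

    def keep_moving_objects(returns, ego_speed, min_speed=1.0):
        # Discarding near-stationary returns suppresses false alarms from
        # signs and overpasses -- but it also discards a stopped firetruck
        # or a fixed concrete divider directly in the lane.
        return [r for r in returns if abs(ground_speed(r, ego_speed)) > min_speed]

    # Ego car at 25 m/s: a stopped object ahead closes at 25 m/s,
    # a slower lead car at 5 m/s.
    returns = [RadarReturn(range_m=60.0, closing_speed=25.0),  # stationary
               RadarReturn(range_m=80.0, closing_speed=5.0)]   # moving lead car
    print(keep_moving_objects(returns, ego_speed=25.0))  # stationary return dropped

The trade-off is deliberate: keeping every stationary return would flood the system with false positives, which is why the article stresses that Autopilot still requires constant human supervision.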

Other Details

Category Option Rationale
Consequence death, harm, property (a) death: The consequence of the software failure incident reported in the news article was the death of the driver, Wei Huang, who was using Tesla's Autopilot feature when his Model X SUV crashed into a concrete highway lane divider and burst into flames. This incident marked the second confirmed fatal crash on US roads involving Tesla's Autopilot system [68923]. (b) property: The Model X SUV was severely damaged in the crash and the fire that followed [68923].
Domain transportation (a) The failed system in the incident was related to the transportation industry. The incident involved Tesla's Autopilot feature controlling a Model X SUV, which crashed into a concrete highway lane divider [68923]. The Autopilot system is designed to assist drivers in controlling the vehicle, placing it squarely in the transportation domain.
