Incident: Tesla Autopilot Accidents: Self-Driving Feature Failures and Investigations

Published Date: 2021-08-17

Postmortem Analysis
Timeline 1. The software failure incidents involving Tesla cars using Autopilot or other self-driving features that crashed into emergency vehicles occurred between January 22, 2018, and July 10, 2021 [117484]. 2. The article reporting the incidents was published on August 17, 2021 [117484].
System 1. Tesla's Autopilot feature 2. Tesla's traffic-aware cruise control 3. Tesla Model Y, X, S, and 3 with model years 2014 to 2021 [117484]
Responsible Organization 1. Tesla, whose Autopilot feature and traffic-aware cruise control were implicated in the crashes, was responsible for the software failure incidents reported in the news article [117484].
Impacted Organization 1. Emergency vehicles involved in the accidents [117484] 2. Tesla drivers and passengers in the vehicles involved in the accidents [117484] 3. Other non-Tesla drivers on the road who could be injured by cars using the Autopilot feature [117484]
Software Causes 1. The Autopilot and traffic-aware cruise control software failed to properly detect and respond to stationary objects at crash scenes, such as first responder vehicles, flares, illuminated arrow boards, and road cones [117484]. 2. The design decision to ignore stationary objects when traveling at more than 40 mph meant the system did not appropriately slow down or respond to the stopped emergency vehicles [117484]. 3. The software could not interpret visual cues at accident scenes, such as road flares and flashing lights, that would be obvious to a human driver [117484]. A simplified sketch of this failure mode is shown below.
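The stationary-object behavior described in cause 2 can be illustrated with a minimal, hypothetical decision loop. This is a sketch only, not Tesla's implementation: the function names, the speed-gated filter, and the crude time-gap check are assumptions chosen to show how discarding stationary detections above roughly 40 mph can leave a stopped emergency vehicle unaddressed at highway speed.

# Illustrative sketch only (hypothetical, not Tesla's code): a simplified
# cruise-control decision that discards stationary detections above a speed
# threshold, the design behavior described in software cause 2 above.
from dataclasses import dataclass

MPH_TO_MPS = 0.44704
STATIONARY_FILTER_SPEED_MPS = 40 * MPH_TO_MPS  # stationary objects ignored above ~40 mph

@dataclass
class Detection:
    distance_m: float        # range to the detected object
    object_speed_mps: float  # 0.0 for a parked fire truck or police car

def should_brake(ego_speed_mps: float, detection: Detection) -> bool:
    """Return True if this hypothetical controller would command braking."""
    is_stationary = abs(detection.object_speed_mps) < 0.5
    # The failure mode: above the threshold, stationary returns are discarded
    # (e.g., to avoid false braking on signs and overpasses), so a stopped
    # emergency vehicle produces no braking response at highway speed.
    if is_stationary and ego_speed_mps > STATIONARY_FILTER_SPEED_MPS:
        return False
    # Crude time-gap check, purely for illustration.
    return detection.distance_m < 5.0 * ego_speed_mps

if __name__ == "__main__":
    parked_fire_truck = Detection(distance_m=50.0, object_speed_mps=0.0)
    print(should_brake(65 * MPH_TO_MPS, parked_fire_truck))  # False: detection filtered out
    print(should_brake(30 * MPH_TO_MPS, parked_fire_truck))  # True: below the filter threshold

At 65 mph the stationary fire truck is filtered out and no braking is commanded, while at 30 mph the same detection triggers braking, matching the failure pattern described above.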
Non-software Causes 1. Lack of human driver supervision while using the Autopilot feature [117484] 2. Misunderstanding or overestimation of the capabilities of the self-driving technology by Tesla owners [117484] 3. Design limitations in the self-driving technology, such as ignoring stationary objects when traveling at high speeds [117484]
Impacts 1. The software failure incidents involving Tesla cars using Autopilot or other self-driving features resulted in at least 11 accidents, seven of which caused 17 injuries and one death [117484]. 2. Following news of the probe by the National Highway Traffic Safety Administration, Tesla's stock fell by 5% in morning trading [117484]. 3. The incidents renewed concerns about the safety of Tesla's Autopilot feature, which the National Transportation Safety Board had found partly to blame in a 2018 fatal crash in Florida [117484]. 4. The safety agency's investigation aims to better understand the causes of the Tesla crashes and the technologies used to monitor, assist, and enforce driver engagement while Autopilot is in use [117484]. 5. Analysts and critics have highlighted the potential danger not only to Autopilot users but also to other, non-Tesla drivers on the road who could be put at risk by cars using the feature [117484].
Preventions 1. Implementing stricter driver supervision requirements for the Autopilot feature to ensure active driver engagement at all times could have prevented the software failure incident [117484]. 2. Enhancing the technology and methods used to monitor, assist, and enforce the driver's engagement with driving while Autopilot is in use could have helped prevent the accidents involving Tesla vehicles and emergency vehicles [117484]. 3. Improving the adaptive cruise control and self-driving technology to better detect and respond to stationary objects, especially in emergency situations, could have mitigated the risks associated with the Autopilot feature [117484].
Fixes 1. Implement stricter driver supervision requirements for the Autopilot feature to ensure active driver engagement at all times [117484]. 2. Enhance the technologies and methods used to monitor, assist, and enforce the driver's engagement with driving while Autopilot is in use [117484]; a sketch of such an engagement watchdog follows this list. 3. Improve the system's ability to detect and respond appropriately to stationary objects, especially in emergency situations [117484].
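Fix 2 can be made concrete with a minimal, hypothetical driver-engagement watchdog. The timeouts, escalation states, and attention signals below are assumptions for illustration only, not any vendor's actual design; in a real vehicle the attention signal might come from steering-wheel torque or a driver-facing camera, and the final state would hand control back to the driver or bring the car to a safe stop.

# Minimal, hypothetical sketch of a driver-engagement watchdog (fix 2 above).
# All timeouts, state names, and attention signals are assumptions for
# illustration; this is not any vendor's actual design.
import time

WARN_AFTER_S = 10.0       # show a visual warning after 10 s without attention
ESCALATE_AFTER_S = 20.0   # add an audible alert after 20 s
DISENGAGE_AFTER_S = 30.0  # disengage assistance and slow the car after 30 s

class EngagementMonitor:
    def __init__(self) -> None:
        self.last_attention_ts = time.monotonic()

    def report_attention(self) -> None:
        # Call whenever steering-wheel torque or a driver-facing camera
        # confirms the driver is engaged.
        self.last_attention_ts = time.monotonic()

    def check(self) -> str:
        # Return the escalation action the system should take right now.
        idle_s = time.monotonic() - self.last_attention_ts
        if idle_s >= DISENGAGE_AFTER_S:
            return "disengage_and_slow"
        if idle_s >= ESCALATE_AFTER_S:
            return "audible_alert"
        if idle_s >= WARN_AFTER_S:
            return "visual_warning"
        return "ok"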
References 1. National Highway Traffic Safety Administration (NHTSA) [Article 117484] 2. National Transportation Safety Board [Article 117484] 3. Police in a Houston suburb [Article 117484] 4. Lars Moravy, Tesla’s vice president of vehicle engineering [Article 117484] 5. Gordon Johnson, analyst and vocal critic of Tesla [Article 117484] 6. Sam Abuelsamid, expert in self-driving vehicles and principal analyst at Guidehouse Insights [Article 117484]

Software Taxonomy of Faults

Category Option Rationale
Recurring one_organization, multiple_organization (a) The crashes into emergency vehicles are not the first time safety concerns have been raised about Tesla's Autopilot feature: the National Transportation Safety Board had previously found Autopilot partly to blame in a 2018 fatal crash in Florida that killed a Tesla driver, and earlier in the year a Tesla crashed in a Houston suburb and killed two people, with the car's adaptive cruise control reportedly engaged before the crash [117484]. (b) The article notes that self-driving options like Tesla's Autopilot, and the adaptive cruise control available on a wide range of automakers' vehicles, include features that can slow a vehicle when necessary. An analyst also warned that Autopilot endangers not only Tesla drivers but other, non-Tesla drivers on the road, indicating that similar incidents or safety concerns may not be limited to Tesla alone and could involve other organizations and their products [117484].
Phase (Design/Operation) design, operation (a) The design aspect of the failure is reflected in the National Highway Traffic Safety Administration (NHTSA) investigation into accidents in which Teslas using Autopilot or other self-driving features crashed into emergency vehicles; the investigation aims to understand the causes of these crashes, including the technologies and methods used to monitor, assist, and enforce the driver's engagement with driving while Autopilot is in use [117484]. (b) The operation aspect is evident in the accidents themselves: Autopilot or traffic-aware cruise control was engaged as the vehicles approached the crash scenes, indicating a failure in the operation or misuse of the system [117484].
Boundary (Internal/External) within_system, outside_system (a) within_system: - The crashes into emergency vehicles were primarily due to contributing factors originating from within the system itself: the Teslas had the self-driving Autopilot feature or traffic-aware cruise control engaged, and the National Highway Traffic Safety Administration (NHTSA) is investigating these accidents [117484]. - Tesla's Autopilot feature has had its safety questioned before, and the NHTSA investigation covers the causes of these crashes and the technologies used to monitor and enforce driver engagement while Autopilot is in use [117484]. - Analyst Gordon Johnson highlighted the danger that Tesla's Autopilot feature poses not only to its drivers but also to other, non-Tesla drivers on the road, emphasizing risks inherent in the feature itself [117484]. (b) outside_system: - The crashes were also influenced by contributing factors originating from outside the system, such as the presence at the accident scenes of control measures like first responder vehicle lights, flares, illuminated arrow boards, and road cones [117484]. - The article adds that a real problem is that many Tesla owners assume their cars can drive themselves; this lack of understanding or misinterpretation of the system's capabilities by users can be considered an external factor contributing to the failure [117484].
Nature (Human/Non-human) non-human_actions, human_actions (a) Non-human actions: The article reports on the National Highway Traffic Safety Administration (NHTSA) investigation into 11 accidents in which Tesla cars using Autopilot or other self-driving features crashed into emergency vehicles. The accidents occurred as the Teslas approached the scenes of earlier crashes, where control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones were present. The NHTSA is examining the causes of these crashes, including the technologies and methods used to monitor, assist, and enforce the driver's engagement with driving while Autopilot is in use [117484]. (b) Human actions: The article mentions a fatal 2018 crash in Florida in which the National Transportation Safety Board found Autopilot partly to blame. In a separate case in a Houston suburb, police reported that no one was in the driver's seat of a Tesla that crashed and killed two people, a charge Tesla denied; Tesla's vice president of vehicle engineering confirmed that the adaptive cruise control was engaged and accelerated before the crash. These incidents highlight the potential role of human actions in contributing to software failure incidents involving Tesla vehicles [117484].
Dimension (Hardware/Software) hardware, software (a) Hardware: The National Highway Traffic Safety Administration is investigating accidents in which Tesla cars using Autopilot or other self-driving features crashed into emergency vehicles; the crashes occurred as the vehicles approached earlier crash scenes where control measures like first responder vehicle lights, flares, an illuminated arrow board, and road cones were present [117484]. The article also discusses a crash that killed two people, in which police reported no one was in the driver's seat and Tesla's adaptive cruise control was engaged and accelerated before impact; this raises concerns about both the hardware and the software systems in Tesla vehicles [117484]. (b) Software: The National Transportation Safety Board found Autopilot partly to blame in a 2018 fatal crash in Florida that killed a Tesla driver, indicating a software-related issue [117484]. Concerns were also raised about the Autopilot feature and the need for active driver supervision despite Tesla's claims about the safety of cars using Autopilot; the investigation into the technologies and methods used to monitor, assist, and enforce driver engagement while Autopilot is in use suggests potential software-related issues [117484].
Objective (Malicious/Non-malicious) non-malicious (a) The incidents do not appear to be malicious. They were attributed to the self-driving features of the Tesla cars, particularly Autopilot and traffic-aware cruise control, being engaged as the vehicles approached the crash scenes. The National Highway Traffic Safety Administration (NHTSA) is investigating to better understand the causes of these crashes, including the technologies and methods used to monitor, assist, and enforce the driver's engagement with driving while Autopilot is in use [117484]. (b) The failure can be categorized as non-malicious: it was not caused by humans intending to harm the system but by the limitations and potential errors of the vehicles' self-driving features. The safety agency emphasized that no commercially available motor vehicles today are capable of driving themselves and that drivers must use advanced driving assistance features correctly and responsibly [117484].
Intent (Poor/Accidental Decisions) unknown The article does not provide specific information about the intent behind the failures of Tesla vehicles that crashed into emergency vehicles while using Autopilot. The incidents are being investigated by the National Highway Traffic Safety Administration to better understand the causes of the crashes, including the technologies and methods used to monitor, assist, and enforce the driver's engagement with driving while Autopilot is in use [117484]. The focus is on understanding the contributing factors and ensuring driver responsibility and safety rather than on explicitly attributing the failure to poor or accidental decisions.
Capability (Incompetence/Accidental) development_incompetence (a) Development incompetence is suggested by the National Highway Traffic Safety Administration (NHTSA) investigation into accidents in which Tesla cars using Autopilot or other self-driving features crashed into emergency vehicles; the investigation covers the causes of these crashes, including the technologies and methods used to monitor, assist, and enforce the driver's engagement with driving while Autopilot is in use [117484]. (b) Accidental factors are highlighted in the article's discussion of how self-driving options like Tesla's Autopilot and adaptive cruise control are designed to ignore stationary objects when traveling at more than 40 mph to prevent unnecessary braking. The real problem arises when Tesla owners assume their cars can drive themselves, leading to confusion and potential errors at accident sites whose cues make more sense to humans than to the automated driving system [117484].
Duration temporary The failure can be considered temporary: the crashes involving Tesla cars using Autopilot or other self-driving features were due to contributing factors introduced by particular circumstances, namely the cars approaching earlier crash scenes with Autopilot or traffic-aware cruise control engaged [117484]. The National Highway Traffic Safety Administration investigation aims to better understand the causes of these crashes, including the technologies and methods used to monitor, assist, and enforce the driver's engagement with driving while Autopilot is in use. This suggests the failure is not permanent but temporary, tied to specific circumstances that are being actively investigated [117484].
Behaviour crash (a) crash: The failure can be categorized as a crash: Tesla cars using Autopilot or other self-driving features crashed into emergency vehicles when coming upon the scene of an earlier crash, resulting in injuries and one death [Article 117484]. (b) omission: The article does not indicate that the system omitted to perform its intended functions at particular instances. (c) timing: The incident is not described as a timing issue in which the system performed its intended functions correctly but too late or too early. (d) value: The incident is not described as the system performing its intended functions incorrectly. (e) byzantine: The incident is not described as a byzantine failure with inconsistent responses and interactions. (f) other: The behaviour does not fall under the "other" category in the provided article.

IoT System Layer

Layer Option Rationale
Perception sensor, processing_unit, embedded_software (a) sensor: Tesla vehicles using Autopilot crashed into emergency vehicles at accident scenes, mostly at night, where control measures like first responder vehicle lights, flares, an illuminated arrow board, and road cones were present; this suggests the sensors responsible for detecting such objects failed to properly identify the emergency vehicles and control measures [117484]. (b) actuator: The article does not mention any failures related to actuators. (c) processing_unit: Tesla's Autopilot feature and adaptive cruise control were engaged in the accidents, and the National Highway Traffic Safety Administration is investigating the technologies and methods used to monitor, assist, and enforce the driver's engagement with driving while Autopilot is in use, suggesting possible issues with the processing unit or the algorithms governing Autopilot's behavior [117484]. (d) network_communication: The article does not indicate any failures related to network communication. (e) embedded_software: Autopilot's safety has been questioned before, including a 2018 fatal crash in which Autopilot was partly blamed and a more recent incident in which Tesla's adaptive cruise control was engaged before a crash, suggesting possible issues with the embedded software controlling these autonomous driving features [117484].
Communication unknown The article does not provide specific information relating the failure to the communication layer of the cyber-physical system.
Application The incident does not appear to be related to the application layer of the cyber-physical system: the reporting focuses on the functionality and limitations of the Autopilot feature, driver supervision, and the interaction between the vehicles and their surroundings rather than on bugs, operating system errors, unhandled exceptions, or incorrect usage at the application layer [117484].

Other Details

Category Option Rationale
Consequence death, harm (a) death: People lost their lives due to the software failure - the National Highway Traffic Safety Administration reported that one death resulted from the 11 accidents involving Tesla cars using Autopilot or other self-driving features [Article 117484]. (b) harm: People were physically harmed due to the software failure - the NHTSA reported that seven of the accidents resulted in 17 injuries [Article 117484].
Domain transportation (a) The failed system in the incident was related to the transportation industry. The software failure incident involved Tesla cars using Autopilot or other self-driving features that crashed into emergency vehicles at accident scenes [Article 117484].

Sources
