Incident: Tesla Autopilot System Failure in Fatal Crashes.

Published Date: 2016-06-30

Postmortem Analysis
Timeline
1. The software failure incident in which a Tesla in Autopilot mode failed to detect a white tractor-trailer turning into its path, resulting in a fatal crash, occurred in May 2016 [Article 46608].
2. The glitch in Tesla's Autopilot technology that contributed to the death of Walter Huang in Mountain View, California, occurred on March 23, 2018 [Article 95565].
3. The first fatal Autopilot crash, in which Tesla's software failed to "see" the white side of a tractor-trailer, occurred in 2016 [Article 71225].
4. The crash that killed a Tesla driver in Florida when the car's Autopilot failed to slow down occurred in May 2016 [Article 44918].
5. The crash that killed a Tesla driver in Florida when the car failed to apply the brakes occurred on May 7, 2016 [Article 44675].
6. The crash in California in which Autopilot was engaged and the driver failed to put his hands back on the steering wheel occurred on March 23, 2018 [Article 68946].
7. The crash in which Autopilot failed to keep the driver's vehicle in its lane occurred more than six months before the article was published, on an unknown date [Article 112983].
8. The incident in which Autopilot failed to prevent a collision with a stopped object occurred before the article was published, on an unknown date [Article 124207].
9. The incident in which a Tesla driver struck a motorcycle while Autopilot was active occurred in the summer of an unknown year [Article 133550].
System
1. Autopilot feature in Tesla's vehicles [44675, 45773, 46152, 68946, 71225, 95750, 112983, 117658, 122571, 124034, 124207, 125407, 128819, 128859, 128881, 128884, 129597, 129598, 133871]
Responsible Organization
1. The driver and Tesla were responsible for causing the software failure incident [133871].
2. Limitations in Tesla's Autopilot feature and the driver's distraction, likely from a cellphone game application, were also factors in causing the incident [95750].
Impacted Organization
1. The driver of the Tesla Model X involved in the crash [Article 69732]
2. The driver of the Tesla Model S in the January crash [Article 69732]
3. Tesla as a company [Article 112771]
4. Tesla's shareholders and the public [Article 122571]
5. Two juveniles involved in a Thanksgiving Day crash on Interstate-80 [Article 136822]
6. The driver of a 2021 Tesla Model S in an eight-vehicle crash on the Bay Bridge [Article 137657]
Software Causes
1. The car's software failed to apply the brakes when a tractor-trailer made a left turn in front of the Tesla, because neither Autopilot nor the driver noticed the trailer against a brightly lit sky [44675, 45773, 45863].
2. The incident involved a double failure in which both the cameras and the radar of Tesla's Autopilot system struggled to detect and avoid obstacles, so the brakes were never applied (see the fusion sketch after this list) [45773].
3. The Autopilot system failed to recognize a white truck crossing a rural highway because the camera was confused by the truck appearing against a bright sky, leading to the crash [68946].
4. The crash was caused by system limitations in Tesla's Autopilot feature combined with the driver's distraction, likely from a cellphone game application, leaving no response to the developing situation [95750].
5. Autopilot failed to keep the driver's vehicle in its lane, its collision-avoidance software failed to detect a highway barrier, and the driver was distracted by a game on his phone [112983].
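To make the double failure in causes 1 and 2 concrete, below is a minimal, hypothetical Python sketch (not Tesla's actual code) of an "agreement" fusion policy that brakes only when camera and radar both report an obstacle with high confidence. Under such a policy, a camera washed out by a bright sky and a radar that discounts a high, flat crossing trailer jointly produce no braking decision; all names and thresholds are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    obstacle: bool     # the sensor believes an obstacle is in the path
    confidence: float  # 0.0 .. 1.0

def should_brake(camera: Detection, radar: Detection,
                 threshold: float = 0.7) -> bool:
    """Hypothetical 'agreement' fusion: brake only when BOTH sensors
    report an obstacle with confidence above the threshold."""
    camera_sees = camera.obstacle and camera.confidence >= threshold
    radar_sees = radar.obstacle and radar.confidence >= threshold
    return camera_sees and radar_sees

# Approximation of the May 2016 scenario: each channel under-reports,
# so the joint decision is "do not brake".
camera = Detection(obstacle=False, confidence=0.2)  # washed out by bright sky
radar = Detection(obstacle=True, confidence=0.4)    # discounted as overhead sign
assert should_brake(camera, radar) is False
```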
Non-software Causes
1. Driver error, as the driver may have been watching a Harry Potter DVD [Article 45773]
2. Lack of response from the driver due to distraction, likely from a cellphone game application, and overreliance on the Autopilot system [Article 95750]
3. Deficiencies in road maintenance and lane markings by California officials, which confused the Autopilot software [Article 95565]
Impacts
1. A fatal crash in California in which the vehicle collided with a concrete divider, killing the driver and injuring others involved [Article 68946].
2. A crash in Utah in which the vehicle collided with a stopped object, highlighting the technology's failure to prevent the collision [Article 71225].
3. A crash on Interstate-80 near Treasure Island that sent two juveniles to a hospital with minor injuries and caused lengthy delays on the bridge [Article 136822].
4. An eight-car pile-up on San Francisco's Bay Bridge that injured multiple people, two of whom were rushed to the hospital [Article 137525].
Preventions
1. Implementing a new radar-based braking system could have potentially prevented the crash [Article 95750].
2. Incorporating a computer chip and hardware to provide backups in case of an Autopilot malfunction could have helped prevent the incident [Article 122571].
3. Enhancing the software to better handle sensor fusion and conflicting information from multiple sensors could have prevented the failure (see the sketch after this list) [Article 45773].
4. Improving the lane markings and road maintenance where the crash occurred could have prevented confusion for the Autopilot software [Article 95565].
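Prevention 3 amounts to changing the fusion policy so that a single confident sensor can trigger braking even when the others disagree. Below is a hedged sketch of such an "any confident sensor" rule; the function and threshold are hypothetical, and the trade-off is that a more permissive policy raises the risk of spurious braking.

```python
def should_brake_redundant(camera_conf: float, radar_conf: float,
                           threshold: float = 0.7) -> bool:
    """Hypothetical fail-safe fusion: brake if EITHER sensor reports an
    obstacle with confidence above the threshold. This misses fewer
    obstacles at the cost of more false-positive braking events."""
    return camera_conf >= threshold or radar_conf >= threshold

# With a strengthened radar path (prevention 1), a confident radar
# return alone is enough to brake even when the camera sees nothing.
print(should_brake_redundant(camera_conf=0.2, radar_conf=0.8))  # True
```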
Fixes
1. Implementing software updates to improve Autopilot's ability to recognize emergency vehicles [Article 122562]
2. Conducting a recall to address issues with the Full Self-Driving Beta software [Article 124207]
3. Providing detailed information on software updates and potential defects to regulatory authorities such as NHTSA [Article 119510]
4. Regularly updating and refining the software to address glitches and limitations [Article 113022]
5. Investigating and addressing system failures related to object and event detection by the software [Article 117658]
References
The articles gather information about the software failure incident from the following entities:
1. Tesla [45773, 47057, 69732, 69743, 71225, 84474, 112983, 124207, 128881]
2. National Highway Traffic Safety Administration (NHTSA) [68946, 71225, 84474, 117658, 128819, 128884, 129597]
3. University of South Carolina [47057]
4. Zhejiang University [47057]
5. Qihoo 360 [47057]
6. Apple [69732, 95565, 112983]
7. Defcon hacker conference [47057]
8. CNET [69743]
9. California Transportation Department [112983]
10. National Transportation Safety Board (NTSB) [84474, 112983]
11. U.S. Senators [129597]
12. U.S. safety board [129597]
Note: the list includes entities mentioned in the articles that are relevant to the software failure incident.

Software Taxonomy of Faults

Category Option Rationale
Recurring: one_organization, multiple_organization
(a) The software failure incident has happened again at one organization:
- Tesla has faced similar incidents before, such as a fatal crash in Florida where the driver's over-reliance on vehicle automation was determined to be a probable cause [Article 71593].
- Tesla's business has been facing challenges, including a recent crash and Model 3 production delays, leading to a recall of 123,000 Model S cars due to a faulty steering bolt [Article 69743].
(b) The software failure incident has happened again at multiple organizations:
- One article mentions that the scenario of the incident is one of two being considered by the company, indicating that similar incidents may have occurred with other organizations or their products and services [Article 46152].
- Another article discusses the potential for cyber attacks affecting a number of vehicles at the same time, suggesting that this type of incident may not be limited to a single organization [Article 52212].
Phase (Design/Operation): design, operation
(a) Design: Tesla's software refinements may bypass regulatory scrutiny, raising questions about whether defects are being properly addressed through recalls [Article 119510]. Tesla's Full Self-Driving software also had issues such as running stop signs, slow windshield heating systems, and seat belt chimes not sounding, which were to be fixed with online software updates [Article 124034].
(b) Operation: driver behavior factors led to crashes, including mode confusion during Autopilot activations and inadvertent overrides [Article 95750]. In about a quarter of crashes, the primary factor appeared to relate to operating the system where limitations may exist, such as in low-visibility environments involving rain, snow, or ice [Article 129598].
Boundary (Internal/External): within_system, outside_system
(a) within_system:
- The NTSB found that the crash involving Tesla's Autopilot feature was caused by "system limitations" and the driver's lack of response, due to distraction likely from a cellphone game application and overreliance on the Autopilot system [Article 95750].
- The NTSB determined that Autopilot failed to keep the driver's vehicle in the lane, its collision-avoidance software failed to detect a highway barrier, and the lack of sufficient system controls resulted in a fatal collision [Article 112983].
- The NHTSA order required manufacturers to disclose crashes where the software was in use within 30 seconds of the crash, to determine whether defects within the system posed an unreasonable risk to safety (a sketch of this check follows this entry) [Article 128884].
(b) outside_system:
- The investigation into a tragic incident involving a Tesla vehicle considered whether the crash was caused by a software glitch or human error, with reports suggesting the driver may have been watching a Harry Potter DVD, indicating a potential external distraction factor [Article 45773].
- The NTSB blamed Tesla, drivers, and lax regulation by NHTSA for collisions involving Teslas crashing beneath crossing tractor-trailers, highlighting external factors contributing to the crashes [Article 117658].
- The NTSB emphasized that crashes are tracked differently by automakers and discouraged comparisons due to the lack of comprehensive metrics on how widely each system is used, indicating external variability in reporting and tracking practices [Article 129597].
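The 30-second disclosure rule cited for Article 128884 reduces to a simple telemetry check. Below is a minimal sketch, assuming a hypothetical log of (start, end) engagement intervals for the driver-assistance software; the function name and data layout are illustrative, not any manufacturer's actual reporting code.

```python
from datetime import datetime, timedelta

def reportable(crash_time: datetime,
               engagements: list[tuple[datetime, datetime]],
               window: timedelta = timedelta(seconds=30)) -> bool:
    """Return True if driver-assistance software was engaged at any
    point in the 30 seconds before the crash, making the crash
    disclosable under the order described in Article 128884."""
    window_start = crash_time - window
    return any(start <= crash_time and end >= window_start
               for start, end in engagements)

# Example: Autopilot disengaged 10 seconds before impact -> reportable.
t = datetime(2021, 6, 1, 12, 0, 0)
print(reportable(t, [(t - timedelta(minutes=5), t - timedelta(seconds=10))]))  # True
```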
Nature (Human/Non-human): non-human_actions, human_actions
(a) Non-human actions:
- A group was able to convince Autopilot that an object existed when it did not, potentially leading to incorrect driving decisions [Article 47057].
- Automatic braking features can cause rear-end collisions if the system stops the car unnecessarily after falsely detecting an obstacle [Article 124899].
- Tesla's Autopilot software has been implicated in accidents because the cars were unable to "see" parked objects [Article 125407].
(b) Human actions:
- In some incidents neither Autopilot nor the driver noticed obstacles, leading to accidents [Articles 44675, 45773].
- In other cases drivers failed to respond to warnings and keep their hands on the steering wheel, resulting in crashes [Articles 68946, 69732].
- One crash was caused by "system limitations" in Tesla's Autopilot feature and the driver's lack of response, due to distraction likely from a cellphone game application and overreliance on the Autopilot system [Article 95750].
Dimension (Hardware/Software): hardware, software
(a) Hardware:
- The fire in the vehicle originated from the vehicle's power distribution system and related components located at the front end of the vehicle [Article 114906].
- The fire within the vehicle was a result of damage to the battery, power distribution systems, or systems associated with battery cell temperature regulation [Article 114907].
(b) Software:
- The crash was attributed to "system limitations" in Tesla's Autopilot feature and the driver's lack of response due to distraction and overreliance on the Autopilot system [Article 95750].
- The AV industry faces challenges including the difficulty of handling edge cases and the need for software to mature fast enough to gain trust and regulatory approval [Article 125407].
Objective (Malicious/Non-malicious): malicious, non-malicious
(a) Malicious: researchers were able to purposefully confuse Autopilot's sensors using off-the-shelf products to carry out a hack, potentially putting both passengers and the general public in danger [Article 47057].
(b) Non-malicious: the investigation into a tragic incident involving a Tesla vehicle concluded that it was unclear whether the incident was caused by a software glitch or human error. Neither the system nor the driver noticed the white side of a tractor-trailer against a brightly lit sky, with a fatal outcome, highlighting the vulnerability of driverless vehicles to human factors and the limitations of semi-autonomous systems [Article 45773].
Intent (Poor/Accidental Decisions): poor_decisions, accidental_decisions
(a) Poor decisions:
- Certain decisions, such as using cameras instead of lidar, were made somewhat arbitrarily without a deep research phase, and some team members disagreed with them [Article 127766].
- Tesla's decision to release software that requires regular intervention by humans created a fundamental challenge, leading to a struggle for control between the car's system and the human driver in certain situations [Article 124207].
(b) Accidental decisions:
- The car failed to apply the brakes because neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, leading to a fatal accident [Articles 44675, 45773, 45863].
- The crash was caused by system limitations in Tesla's Autopilot feature and the driver's lack of response due to distraction, indicating an unintended decision leading to the failure [Article 95750].
Capability (Incompetence/Accidental): development_incompetence, accidental
(a) Development incompetence:
- The fatal Tesla crash highlighted issues with sensor fusion, where multiple sensors giving conflicting information led to misunderstandings similar to those between human drivers [Article 45773].
- The NTSB found that the crash was caused by "system limitations" in Tesla's Autopilot feature and the driver's lack of response due to distraction, indicating a failure in the system controls as well as human error [Article 95750].
- The safety board determined that while Autopilot worked as intended, it nonetheless played a major role in the crash due to the combined effects of human error and a lack of sufficient system controls [Article 112983].
(b) Accidental:
- The Tesla crash raised concerns about accidental failures in self-driving technology, such as the system misinterpreting a truck as an overhead road sign due to sensor fusion issues [Article 45773].
- Cars become more vulnerable to hacking as they become more hi-tech, indicating potential accidental failures due to increased vulnerabilities [Article 45773].
- Automatic braking features caused rear-end collisions by falsely detecting obstacles, indicating accidental problems with the system [Article 124899].
Duration: permanent, temporary
(a) Permanent: some failures stem from contributing factors present under all circumstances. Deficiencies in Tesla's software were cited as contributing to a fatal crash [Article 95565], and another crash was attributed to "system limitations" in Tesla's Autopilot feature [Article 95750]; these point to underlying issues in the software itself rather than transient conditions.
(b) Temporary: other failures arise only under certain circumstances. Tesla reduced the time the driver can go without touching the wheel before the system issues warnings, a measure addressing a specific condition [Article 84912], and concerns were raised about rapid deceleration occurring without warning under specific conditions [Article 123946].
Behaviour: crash, omission, timing, value, other
(a) crash: failure due to the system losing state and not performing any of its intended functions.
- The NTSB found that the fatal crash was caused by "system limitations" in Tesla's Autopilot feature, with the car veering off the road due to limitations in the Autopilot vision system's processing software [95750].
- The NHTSA looked at crashes whose main cause was running Autopilot in areas with limitations or conditions that interfere with its operation [128881].
(b) omission: failure due to the system omitting to perform its intended functions at an instance.
- Advanced driver assistance systems can cause collisions when the system, locked onto the car in front, fails to recognize a stopped vehicle ahead and reacts too late to stop [72381].
- At highway speeds, the Autopilot system remained incapable of detecting some stationary objects or objects moving perpendicular to the car [84912].
(c) timing: failure due to the system performing its intended functions correctly, but too late or too early.
- Tesla reduced the time the driver can go without touching the wheel before the system issues warnings, indicating a timing issue in driver engagement with the system (see the timer sketch after this entry) [84912].
(d) value: failure due to the system performing its intended functions incorrectly.
- The NTSB noted that Tesla's Autopilot system did not provide an effective means of monitoring the driver's engagement [95724].
(e) byzantine: failure due to the system behaving erroneously with inconsistent responses and interactions.
- The automated driving system could not detect or cope with relevant features of its Operational Design Domain, leading to a struggle for control between the system and the human driver [124207].
(f) other: failure due to the system behaving in a way not described by the options above.
- The fatal Tesla crash was attributed to sensor fusion issues, where multiple sensors gave conflicting information, leading to a truck being misinterpreted as an overhead road sign [45773].
- Tesla's Full Self-Driving software exhibited non-human behavior such as getting dangerously close to pedestrians and failing to register obstacles like bollards [125407].
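The timing behaviour in (c), shortening how long a driver may go hands-free before Autopilot warns, reduces to a countdown keyed to detected steering-wheel torque. A minimal hypothetical sketch follows; the limit value and class name are illustrative, not Tesla's.

```python
class HandsOnMonitor:
    """Hypothetical driver-engagement timer: warn when the wheel has
    gone untouched longer than `limit_s`. Article 84912 describes Tesla
    shortening this limit; the value here is illustrative."""

    def __init__(self, limit_s: float = 15.0):
        self.limit_s = limit_s
        self.hands_off_s = 0.0

    def tick(self, dt_s: float, torque_detected: bool) -> bool:
        """Advance the timer by dt_s seconds; return True if a hands-on
        warning is now due."""
        self.hands_off_s = 0.0 if torque_detected else self.hands_off_s + dt_s
        return self.hands_off_s > self.limit_s

monitor = HandsOnMonitor(limit_s=15.0)
for second in range(20):
    if monitor.tick(dt_s=1.0, torque_detected=False):
        print(f"warning issued after {second + 1} hands-free seconds")
        break
```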

IoT System Layer

Layer Option Rationale
Perception: sensor, embedded_software
(a) sensor: the failure relates to sensor error at the perception layer. Researchers demonstrated how sensors could be purposefully confused to deceive the Autopilot system [46608], using off-the-shelf products to carry out hacks that confused Autopilot's sensors [47057] (see the plausibility-check sketch after this table). Adverse weather conditions can also create visibility problems for the sensors powering driverless technology [45773].
(b) actuator: the articles do not specifically attribute the failure to actuator error.
(c) processing_unit: the articles do not specifically attribute the failure to processing error.
(d) network_communication: the articles do not specifically attribute the failure to network communication error.
(e) embedded_software: the failure also relates to embedded software error at the perception layer. The incident in which a Tesla in Autopilot mode failed to detect a white tractor-trailer raised concerns about the software's ability to interpret sensor data accurately [46608], and Autopilot failed to keep the driver's vehicle in the lane while its collision-avoidance software failed to detect a highway barrier [112983].
Communication: unknown
The articles do not explicitly mention a failure related to the communication layer of the cyber-physical system.
Application: TRUE
The failure can be attributed to the application layer based on the following:
1. The crash involving Tesla's Autopilot feature was caused by "system limitations" and the driver's distraction, likely from a cellphone game application [Article 95750].
2. Autopilot failed to keep the driver's vehicle in the lane and its collision-avoidance software failed to detect a highway barrier, leading to a fatal collision [Article 112983].
3. In about a quarter of the crashes investigated, the primary crash factor appeared to relate to operating the system where limitations may exist [Article 129598].
Based on this information, the software failure incidents were related to the application layer of the cyber-physical system [95750, 112983, 129598].
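One common defence against the sensor spoofing described in the Perception row is to reject tracked objects whose motion is physically implausible across frames, since spoofed "ghost" detections often appear or jump in ways no real object can. This is a hedged sketch of the general idea under assumed units and thresholds, not any vendor's implementation.

```python
def plausible_track(positions_m: list[float], dt_s: float,
                    max_speed_mps: float = 60.0) -> bool:
    """Hypothetical plausibility filter: reject a track whose implied
    frame-to-frame speed exceeds what a physical object could do.
    `positions_m` are successive along-path distances; dt_s > 0."""
    return all(abs(curr - prev) / dt_s <= max_speed_mps
               for prev, curr in zip(positions_m, positions_m[1:]))

print(plausible_track([50.0, 48.5, 47.1], dt_s=0.05))  # True: ~30 m/s closing
print(plausible_track([50.0, 20.0], dt_s=0.05))        # False: 600 m/s 'ghost'
```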

Other Details

Category Option Rationale
Consequence: death, harm, property, non-human, theoretical_consequence
(a) death: the software failure incident resulted in fatalities. Deficiencies in Tesla's software contributed to a person's death [Article 95565]; the driver would have survived if the software had functioned correctly [Article 112983]; and there were deaths in crashes involving cars operating on Autopilot [Article 122562].
(b) harm: people were physically harmed, with injuries in crashes involving cars operating on Autopilot [Article 122562].
(d) property: people's material goods were impacted, with crashes resulting in property damage [Article 122562].
(e) delay: apart from lengthy traffic delays following one crash [Article 136822], the articles do not discuss postponed activities as a consequence of the software failure.
(f) non-human: non-human entities were impacted; a crash involving Tesla's software revealed flaws in its ability to recognize and respond to objects correctly [Article 124207].
(h) theoretical_consequence: potential consequences that did not occur were discussed, such as cyber attacks, security risks, and the need to improve sensor reliability [Articles 45773, 46608, 52212, 125407, 128884].
(i) other: the articles do not mention other specific consequences of the software failure.
Domain: transportation, finance
(a) The failed system related to the transportation industry, specifically semi-autonomous vehicles and driver assistance technology [45773, 95574, 111927, 113806, 119510, 124207, 125407, 128881, 128884, 129597, 129598].
(h) The incident also involved a whistleblower complaint related to Tesla's solar systems, which could be linked to the finance industry given the financial implications and regulatory investigations [122571, 133685].

Sources
