Incident: Tesla Autopilot System Failure Leads to Third Crash in Two Weeks

Published Date: 2016-07-11

Postmortem Analysis
Timeline
1. The software failure incident, in which a Tesla Model X crashed while allegedly under the control of the car's Autopilot system, occurred on a Sunday morning near the small town of Whitehall, Montana [45902].
2. The article was published on 2016-07-11 07:00:00+00:00.
Estimation: The incident likely occurred on a Sunday morning in July 2016, shortly before the article's publication date.
System 1. Autopilot system on Tesla vehicles [45902]
Responsible Organization 1. The driver of the Tesla Model X was responsible for causing the software failure incident by allegedly using the Autopilot system improperly, not having hands on the steering wheel, and driving on an undivided mountain road instead of a divided highway as recommended by Tesla [45902].
Impacted Organization
1. The driver and passenger of the Tesla Model X involved in the crash near Whitehall, Montana [45902].
2. Tesla Motors, as the manufacturer of the Autopilot system [45902].
3. The National Highway Traffic Safety Administration (NHTSA), which was investigating the fatal crash involving a Tesla Model S in May [45902].
4. The US Securities and Exchange Commission (SEC), which was investigating whether Tesla failed to notify investors in a timely manner about that fatal Autopilot crash [45902].
Software Causes
1. The Autopilot system failed to detect an obstacle (a wood stake) on the road, and the vehicle went on to hit more than 20 stakes, damaging its tire and lights [45902].
Non-software Causes
1. Improper use of the Autopilot system by the driver, including not keeping hands on the steering wheel as required by the system's terms of use, and driving on an undivided mountain road rather than the divided highway for which the feature is designed [45902].
Impacts
1. The software failure incident involving Tesla's Autopilot system led to a crash in Montana, where the car failed to detect an obstacle on the road, resulting in significant damage to the vehicle [45902].
2. The incident raised concerns about proper use of the Autopilot feature, as the driver's hands were not on the steering wheel for over 2 minutes after autosteer was engaged, contrary to the terms of use agreed upon when enabling the feature [45902].
3. The Autopilot system's failure to detect a wood stake on the road led to the vehicle hitting more than 20 wood stakes, damaging the car's tire and lights [45902].
4. The incident highlighted the need for appropriate road conditions and proper use of the Autopilot feature: the vehicle alerted the driver to put his hands on the wheel when road conditions became uncertain, but the driver did not comply, resulting in a collision [45902].
Preventions
1. Proper driver education and training on correct use of the Autopilot system, including the requirement to keep hands on the steering wheel at all times [45902].
2. Enhanced sensors and algorithms in the Autopilot system to improve obstacle detection and response, potentially preventing collisions like the one involving the wood stakes [45902].
3. Clearer communication and warnings within the Autopilot system to alert drivers when they are using the feature in inappropriate or unsafe conditions, such as on undivided mountain roads instead of divided highways [45902].
Fixes
1. Implement stricter monitoring to ensure drivers remain engaged and keep their hands on the steering wheel while Autopilot is active, possibly through enhanced sensors or alerts [45902].
2. Improve the Autopilot system's ability to detect obstacles such as wood stakes by upgrading the radar and camera technology used for navigation [45902].
3. Provide clearer guidelines and restrictions on where and when Autopilot can be used, limiting it to the road conditions and environments for which it was designed [45902].
References
1. Tesla Motors Club [45902]
2. Montana Highway Patrol
3. Detroit Free Press
4. Tesla spokesman
5. National Highway Traffic Safety Administration
6. US Securities and Exchange Commission
7. The Wall Street Journal

Software Taxonomy of Faults

Category Option Rationale
Recurring one_organization, multiple_organization
(a) The software failure incident related to Tesla's Autopilot system has happened again within the same organization. This incident, in which a Tesla Model X crashed while allegedly under the control of the car's Autopilot system, is the third crash reported in the past two weeks linked to the self-driving feature [Article 45902]. The previous incident mentioned in the article was the fatal crash of a Tesla Model S in May, in which the Autopilot system failed to detect a tractor-trailer turning in front of the vehicle [Article 45902].
(b) The article also mentions a separate Autopilot-related crash: a Model X being driven with its Autopilot feature engaged flipped over on the Pennsylvania Turnpike after hitting a guard rail on the right side of the road and rebounding into the median barrier. Fortunately, no one was injured in that crash [Article 45902].
Phase (Design/Operation) design, operation
(a) Design: the Montana incident involved the Autopilot system failing to detect an obstacle on the road, leading to a crash. A friend of the driver wrote in a Tesla Motors Club forum post that the car hit multiple wood stakes and was completely destroyed because Autopilot did not detect the obstacle on the road [45902].
(b) Operation: Tesla stated that the driver in the Montana crash was using the Autopilot system improperly. The data suggested that the driver's hands were not on the steering wheel for over 2 minutes after autosteer was engaged, contrary to the terms of use agreed upon when enabling the feature. The driver did not respond to alerts to put his hands on the wheel, leading to the collision with a post on the edge of the roadway [45902].
Boundary (Internal/External) within_system (a) within_system: The software failure incident related to the Tesla vehicle crash with Autopilot engaged can be attributed to factors originating from within the system. The incident occurred because the Autopilot feature failed to detect an obstacle on the road, leading to the vehicle hitting multiple wood stakes and sustaining significant damage [45902]. Additionally, Tesla mentioned that the driver was not using the Autopilot system properly, as the data suggested that the driver's hands were not on the steering wheel for over 2 minutes after autosteer was engaged, contrary to the terms of use agreed upon when enabling the feature [45902].
Nature (Human/Non-human) non-human_actions, human_actions
(a) Non-human actions: the crash occurred because the Autopilot system failed to detect an obstacle in the road, specifically a wood stake, leading the vehicle to hit multiple stakes and sustain significant damage [45902].
(b) Human actions: Tesla stated that the driver was using the Autopilot system improperly by not keeping hands on the steering wheel as required by the system's terms of use. The driver did not respond to alerts to put his hands on the wheel, leading to the collision with a post on the edge of the roadway [45902].
Dimension (Hardware/Software) hardware, software
(a) Hardware: the Autopilot system's sensing hardware (the radar and camera technology used for navigation) failed to detect an obstacle on the road, specifically a wood stake, and the vehicle hit multiple stakes and sustained significant damage [45902].
(b) Software: data from the Autopilot software suggested that the driver's hands were not on the steering wheel for over 2 minutes after autosteer was engaged, and the system's alerts and safety notifications did not prevent the crash when the driver failed to comply [45902].
Objective (Malicious/Non-malicious) non-malicious (a) The software failure incident related to the Tesla vehicle crash while allegedly under the control of the car's Autopilot system does not appear to be malicious. The incident was attributed to the driver allegedly using the Autopilot system improperly, with the driver's hands not on the steering wheel for over 2 minutes after autosteer was engaged, contrary to the terms of use of the feature [45902]. The crash was a result of the system failing to detect an obstacle on the road and the driver not responding to alerts to put his hands on the wheel, leading to the collision [45902].
Intent (Poor/Accidental Decisions) poor_decisions (a) The software failure incident related to the Tesla vehicle crash while using the Autopilot system can be attributed to poor decisions. The incident involved the driver allegedly not following the terms of use for the Autopilot feature, as the driver's hands were not on the steering wheel for over 2 minutes after autosteer was engaged, contrary to the terms of use agreed upon when enabling the feature [45902]. This failure was exacerbated by the driver using the Autopilot feature on an undivided mountain road, despite it being designed for use on a divided highway in slow-moving traffic.
Capability (Incompetence/Accidental) development_incompetence, accidental
(a) Development incompetence: the Model X crash in Montana showed the Autopilot system failing to detect an obstacle on the road, with the vehicle hitting multiple wood stakes and sustaining significant damage [45902]. Tesla also noted that the driver was not using the feature as intended; the system alerted the driver to put hands on the wheel as road conditions became uncertain, but the driver did not comply, leading to the collision with a post on the edge of the roadway [45902].
(b) Accidental: neither crash appears to have been deliberate. The Montana incident was a single-vehicle crash in which neither the driver nor the passenger was injured, and the Model X that flipped over on the Pennsylvania Turnpike after hitting a guard rail likewise caused no injuries [45902].
Duration temporary (a) The software failure incident in the article is more likely to be temporary rather than permanent. The incident involved a Tesla vehicle crashing while allegedly under the control of the car's Autopilot system. The article mentions that the driver's hands were not on the steering wheel for over 2 minutes after autosteer was engaged, which is contrary to the terms of use agreed upon when enabling the feature. The system alerted the driver to put his hands on the wheel as road conditions became uncertain, but the driver did not do so, leading to the collision [45902]. This indicates that the failure was temporary and resulted from specific circumstances, such as the driver not following the system's requirements.
Behaviour crash (a) The behavior of the software failure incident in this case can be categorized as a crash. The incident involved a Tesla Model X crashing while allegedly under the control of the car's Autopilot system, resulting in damage to the vehicle. The crash occurred because the Autopilot system failed to detect an obstacle in the road, leading to the vehicle hitting multiple wood stakes and sustaining significant damage [45902].

IoT System Layer

Layer Option Rationale
Perception None None
Communication None None
Application None None

Other Details

Category Option Rationale
Consequence death, harm (a) death: People lost their lives due to the software failure - The article mentions a fatal crash of a Tesla Model S in May where the Autopilot system failed to detect a tractor-trailer turning in front of the vehicle, resulting in the death of Joshua Brown, a 40-year-old Tesla enthusiast [45902].
Domain transportation (a) The failed system was intended to support the transportation industry. The software failure incident involved Tesla's Autopilot system, which is a self-driving feature designed to assist in steering the car and matching speeds of slower traffic ahead [45902].
