Incident: Tesla Full Self-Driving Software Failure: Child-Size Mannequin Incident

Published Date: 2022-08-09

Postmortem Analysis
Timeline 1. The software failure incident, in which a Tesla in full self-driving mode ran over a child-size mannequin during a test by The Dawn Project, occurred on June 21, 2022 [131019].
System 1. Tesla's Autopilot active driver assistance system 2. Tesla's full self-driving software 3. Tesla's over-the-internet software update that allowed vehicles to roll through stop signs without coming to a complete halt 4. Tesla's less sophisticated autopilot driver-assist system 5. Tesla's centre touch screens that allowed playing video games while vehicles are moving 6. Tesla Model S sedans and X SUVs from 2016 to 2022 7. Tesla Model 3 sedans from 2017 to 2022 8. Tesla Model Y SUVs from 2020 to 2022 9. Specific Tesla vehicles used in the beta testing of the full self-driving software 10. Tesla's sensor and camera suite used for Autopilot [Cited from Article 131019]
Responsible Organization 1. Tesla was responsible for the software failure incident, as the developer of the Autopilot active driver assistance system and the full self-driving software involved in the child-size mannequin collisions [131019].
Impacted Organization 1. The Dawn Project [131019] 2. Tesla [131019] 3. US National Highway Traffic Safety Administration (NHTSA) [131019]
Software Causes 1. The full self-driving software failed to detect the stationary dummy's presence on the road, causing the vehicle to hit the child-size mannequin repeatedly at an average speed of 25mph [131019].
Non-software Causes 1. Limitations of the Tesla vehicle's sensors and cameras in detecting the stationary dummy's presence on the road during the safety test [131019]. 2. Tesla's practice of beta-testing the full self-driving software on public roads with untrained drivers rather than trained human safety drivers [131019]. 3. Tesla's decision to push an over-the-internet software update that let vehicles roll through stop signs without coming to a complete halt, which led to a recall of nearly 54,000 cars and SUVs [131019].
Impacts 1. The software failure incident involving a Tesla in full self-driving mode running over a child-size mannequin during a safety test by The Dawn Project had deeply disturbing results, as the vehicle failed to detect the stationary dummy's presence in the road and hit it repeatedly at an average speed of 25mph [Article 131019]. 2. The incident led to an open and active investigation by the US National Highway Traffic Safety Administration (NHTSA) into Tesla's Autopilot active driver assistance system, including the full self-driving software [Article 131019]. 3. The failure incident highlighted the dangers of Tesla's full self-driving software, with safety advocates expressing concerns about the risks posed to children and communities by over 100,000 Tesla drivers using the car's full self-driving mode on public roads [Article 131019]. 4. As a consequence of the incident, Tesla recalled nearly 54,000 cars and SUVs due to issues with their full self-driving software allowing them to roll through stop signs without coming to a complete halt, posing a safety risk at junctions [Article 131019]. 5. The failure incident raised questions about the safety of Tesla's software testing practices, with safety advocates criticizing the company for allowing untrained drivers to test vehicles on public roads and highlighting the potential malfunctions of Tesla's software that could endanger other motorists and pedestrians [Article 131019].
Preventions 1. Implementing more rigorous testing procedures for the full self-driving software to ensure accurate detection of objects on the road, such as stationary dummies, to prevent incidents like running over a child-size mannequin [131019]. 2. Enhancing the fail-safe mechanisms in the software to immediately disengage full self-driving mode if the system fails to detect objects or encounters unexpected situations, thereby preventing potential accidents [131019]. 3. Requiring additional training and certification for drivers before allowing them to use the full self-driving mode on public roads to ensure they are prepared to take control of the vehicle in case of software malfunctions or errors [131019].
Fixes 1. Implement stricter regulations and guidelines for testing autonomous driving software on public roads, ensuring that only trained human safety drivers are allowed to operate such vehicles [131019]. 2. Conduct a thorough engineering analysis of Tesla's Autopilot active driver assistance system, including the full self-driving software, to identify and address any potential flaws or issues that may compromise safety [131019]. 3. Enhance the detection capabilities of Tesla's autonomous driving software by improving the sensors, cameras, and radar systems to accurately identify and respond to obstacles, including stationary objects like mannequins on the road [131019]. 4. Update the software to include more robust fail-safe mechanisms that can prevent incidents like the vehicle repeatedly running over a stationary dummy, ensuring that the system can quickly react to unexpected scenarios and avoid collisions [131019].
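To make the fail-safe mechanism described in fix 4 (and prevention 2 above) more concrete, the sketch below shows a minimal supervisory check that requests emergency braking when a detected object falls within the vehicle's stopping distance and hands control back to the driver when perception confidence is too low to resolve an object in the path. This is a purely illustrative sketch, not Tesla's actual software or API; all class names, thresholds, and parameters are hypothetical assumptions.

```python
# Illustrative only: a generic obstacle fail-safe check of the kind suggested
# in fix 4. This is NOT Tesla's code or API; all names and thresholds are
# hypothetical assumptions made for the sake of the example.
from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    distance_m: float   # distance to the detected object in metres
    confidence: float   # detector confidence in [0, 1]


@dataclass
class FailSafeDecision:
    brake: bool         # request emergency braking
    disengage: bool     # hand control back to the driver
    reason: str


class FailSafeMonitor:
    """Minimal supervisory check layered on top of a perception stack."""

    def __init__(self, speed_mps: float, min_confidence: float = 0.5,
                 reaction_time_s: float = 1.5, decel_mps2: float = 6.0):
        self.speed_mps = speed_mps
        self.min_confidence = min_confidence
        self.reaction_time_s = reaction_time_s
        self.decel_mps2 = decel_mps2

    def stopping_distance_m(self) -> float:
        # Reaction distance plus braking distance v^2 / (2a).
        return (self.speed_mps * self.reaction_time_s
                + self.speed_mps ** 2 / (2 * self.decel_mps2))

    def evaluate(self, detections_in_path: List[Detection]) -> FailSafeDecision:
        for d in detections_in_path:
            if d.confidence < self.min_confidence:
                # An object in the path that perception cannot resolve is not
                # ignored; the system brakes and hands back control instead.
                return FailSafeDecision(True, True, "unresolved object in path")
            if d.distance_m <= self.stopping_distance_m():
                return FailSafeDecision(True, False, "object within stopping distance")
        return FailSafeDecision(False, False, "path clear")


# Example: 25 mph (the average speed reported in the test) is roughly 11.2 m/s.
monitor = FailSafeMonitor(speed_mps=11.2)
print(monitor.evaluate([Detection(distance_m=20.0, confidence=0.9)]))
# -> FailSafeDecision(brake=True, disengage=False, reason='object within stopping distance')
```

The stopping-distance calculation is the standard reaction-plus-braking kinematic approximation; a production system would fuse many more signals, but the structure shows how a supervisory layer can act conservatively when perception output is missing or uncertain.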
References 1. The Dawn Project [131019] 2. US National Highway Traffic Safety Administration (NHTSA) [131019] 3. Tesla [131019]

Software Taxonomy of Faults

Category Option Rationale
Recurring one_organization, multiple_organization (a) The software failure incident related to Tesla's full self-driving software has happened again within the same organization. In February 2022, Tesla recalled nearly 54,000 cars and SUVs because their full self-driving software was found to let them roll through stop signs without coming to a complete halt. The over-the-internet software update allowed the vehicles to go through junctions with a stop sign at up to 5.6 miles per hour. Tesla agreed to the recall after meetings with officials from NHTSA, and there were no reported crashes or injuries caused by the feature at that time [131019]. (b) The software failure incident related to Tesla's full self-driving software has also happened at other organizations or with their products and services. Safety advocates have raised concerns that Tesla should not be allowed to test the vehicles on public roads with untrained drivers, as the software can malfunction, exposing other motorists and pedestrians to danger. Most car companies with similar software conduct tests with trained human safety drivers. Additionally, the NHTSA is investigating why Teslas on autopilot have repeatedly crashed into emergency vehicles parked on roads, indicating similar incidents involving different organizations or vehicles [131019].
Phase (Design/Operation) design, operation (a) The software failure incident related to the design phase can be seen in the article where it is mentioned that Tesla recalled nearly 54,000 cars and SUVs because their full self-driving software was found to let them roll through stop signs without coming to a complete halt. This issue was due to an over-the-internet software update that allowed the vehicles to go through junctions with a stop sign at up to 5.6 miles per hour [131019]. (b) The software failure incident related to the operation phase is evident in the article where a Tesla driver complained that the full self-driving software caused a crash. The driver reported that their Model Y went into the wrong lane and was hit by another vehicle, despite receiving an alert halfway through the turn and trying to avoid other traffic. The driver mentioned that the car took control and 'forced itself into the incorrect lane,' leading to the crash [131019].
Boundary (Internal/External) within_system (a) within_system: - The software failure incident related to Tesla's full self-driving mode running over a child-size mannequin during a test by a safety campaign group was within the system. The failure was attributed to the vehicle's failure to detect the stationary dummy's presence in the road despite being in full self-driving mode [131019]. - The failure was also linked to the full self-driving software allowing the vehicle to roll through stop signs without coming to a complete halt, which was a feature of the software itself [131019]. (b) outside_system: - The incident involving the Tesla running over the mannequin during the safety test was not explicitly attributed to factors originating from outside the system in the provided articles.
Nature (Human/Non-human) non-human_actions, human_actions (a) The software failure incident occurring due to non-human actions: - The incident involved a Tesla in full self-driving mode running over a child-size mannequin during a test by a safety campaign group. The vehicle failed to detect the stationary dummy's presence in the road and hit it repeatedly at an average speed of 25mph [Article 131019]. - The failure was related to the full self-driving software of Tesla, which was being tested on a test track in California under controlled conditions [Article 131019]. - The safety test was carried out at the Willow Springs International Raceway and test track in Rosamond, California, where the incident occurred [Article 131019]. (b) The software failure incident occurring due to human actions: - The incident involved the testing of Tesla's full self-driving software, which was being beta-tested by selected Tesla drivers [Article 131019]. - Safety advocates raised concerns that Tesla should not be allowed to test the vehicles on public roads with untrained drivers, suggesting that human actions in allowing untrained drivers to test the software could contribute to failures [Article 131019]. - The NHTSA was investigating a complaint from a Tesla driver that the full self-driving software caused a crash where the driver reported that the car took control and 'forced itself into the incorrect lane' despite the driver's attempts to avoid other traffic [Article 131019].
Dimension (Hardware/Software) hardware, software (a) The software failure incident related to hardware: - The incident involving Tesla's full self-driving software running over a child-size mannequin during a safety test was primarily related to the failure of the vehicle's hardware components, such as cameras, ultrasonic sensors, and radar, to detect the stationary dummy's presence on the road [131019]. (b) The software failure incident related to software: - The failure of Tesla's full self-driving software to detect the stationary dummy and repeatedly run over it during the safety test at the Willow Springs International Raceway was a result of contributing factors originating in the software itself. The incident highlighted the dangers of Tesla's full self-driving software and raised concerns about its effectiveness and safety [131019].
Objective (Malicious/Non-malicious) non-malicious (a) The software failure incident related to the Tesla in full self-driving mode running over a child-size mannequin during a test by a safety campaign group can be categorized as non-malicious. The incident occurred during a safety test conducted by The Dawn Project at a test track in California under controlled conditions [131019]. The failure was a result of the vehicle's inability to detect the stationary dummy's presence on the road, leading to the repeated collisions at an average speed of 25mph. The incident was part of a nationwide public safety ad campaign highlighting the dangers of Tesla's full self-driving software [131019]. Additionally, the failure analysis conducted by the US National Highway Traffic Safety Administration (NHTSA) is focused on investigating Tesla's Autopilot active driver assistance system, including the full self-driving software, to consider all relevant data and information that may assist in the investigation [131019]. This indicates a non-malicious intent to understand the root cause of the failure and ensure the safety of the system.
Intent (Poor/Accidental Decisions) poor_decisions (a) The intent of the software failure incident related to poor_decisions: - The failure of Tesla's full self-driving software to detect a stationary dummy on the road and repeatedly hit it during a safety test conducted by The Dawn Project at a test track in California [131019]. - Tesla's recall of nearly 54,000 cars and SUVs due to the full self-driving software allowing vehicles to roll through stop signs without coming to a complete halt, which was found to be a dangerous feature [131019]. (b) The intent of the software failure incident related to accidental_decisions: - The complaint from a Tesla driver that the full self-driving software caused a crash by forcing the car into the wrong lane, resulting in a collision with another vehicle [131019].
Capability (Incompetence/Accidental) development_incompetence, accidental (a) The software failure incident related to development incompetence is evident in the article as it discusses the failure of Tesla's full self-driving software to detect a stationary dummy on the road during a safety test conducted by The Dawn Project [131019]. The incident highlights a lack of professional competence in the development of the software, leading to a potentially dangerous situation where the vehicle repeatedly ran over the mannequin without detecting it. Additionally, the article mentions a previous recall by Tesla due to the full self-driving software allowing vehicles to roll through stop signs without coming to a complete halt, indicating issues with the software development process [131019]. (b) The software failure incident related to accidental factors is also apparent in the article, particularly in the context of unintended consequences of the software's behavior. For example, the article mentions a complaint from a Tesla driver who reported that the full self-driving software caused a crash by taking control and forcing the vehicle into the incorrect lane, resulting in a collision with another vehicle [131019]. This incident illustrates how accidental factors, such as unexpected behavior or malfunctions in the software, can lead to adverse outcomes despite the driver's attempts to intervene.
Duration permanent, temporary The software failure incident related to the Tesla full self-driving software can be categorized as both permanent and temporary. (a) Permanent: The incident involving the Tesla full self-driving software running over a child-size mannequin during a safety test conducted by The Dawn Project can be considered a permanent failure. This is because the failure was due to contributing factors introduced by all circumstances, such as the software's inability to detect the stationary dummy's presence on the road, leading to repeated collisions at an average speed of 25mph [131019]. (b) Temporary: On the other hand, the incident where Tesla recalled nearly 54,000 cars and SUVs due to their full self-driving software allowing them to roll through stop signs without coming to a complete halt can be seen as a temporary failure. This was due to contributing factors introduced by certain circumstances but not all, specifically related to the software update that enabled the vehicles to go through junctions with a stop sign at up to 5.6 miles per hour. Tesla agreed to the recall and made changes to address this issue [131019].
Behaviour crash, omission, value, other (a) crash: The software failure incident related to a crash is described in the article where a Tesla in full self-driving mode was reported to have repeatedly run over a child-sized mannequin during a safety test conducted by The Dawn Project at a test track in California [131019]. (b) omission: The software failure incident related to omission is evident in the article where Tesla recalled nearly 54,000 cars and SUVs because their full self-driving software was found to let them roll through stop signs without coming to a complete halt. This omission of stopping at stop signs was a failure of the software to perform its intended function [131019]. (c) timing: The software failure incident related to timing is not explicitly mentioned in the provided article. (d) value: The software failure incident related to value is seen in the article where Tesla's full self-driving software allowed vehicles to go through junctions with a stop sign at up to 5.6 miles per hour without coming to a complete halt. This incorrect performance of the software's intended function led to a recall of the vehicles [131019]. (e) byzantine: The software failure incident related to a byzantine behavior, where the system behaves erroneously with inconsistent responses and interactions, is not explicitly mentioned in the provided article. (f) other: The software failure incident related to other behavior is highlighted in the article where a Tesla driver complained that the full self-driving software caused a crash by taking control and forcing the car into the incorrect lane despite the driver's attempts to avoid other traffic. This behavior of the software causing the car to move into the wrong lane is a unique failure scenario not covered by the other options [131019].

IoT System Layer

Layer Option Rationale
Perception sensor, embedded_software (a) sensor: The failure in the software incident related to the Tesla running over a child-size mannequin during a test in full self-driving mode was due to the sensors' inability to detect the stationary dummy's presence on the road. The incident involved the vehicle hitting the mannequin repeatedly at an average speed of 25mph, indicating a sensor failure in detecting the obstacle [131019]. (b) actuator: The articles do not mention any specific failure related to the actuator in the software incident involving the Tesla running over a child-size mannequin during a test in full self-driving mode. (c) processing_unit: The failure in the software incident was not explicitly attributed to a processing unit error. However, it is mentioned that the powerful onboard computer processes inputs from sensors and cameras to assist in making driving safer and less stressful [131019]. (d) network_communication: The failure in the software incident was not directly linked to network communication errors. The incident primarily focused on the vehicle's failure to detect the obstacle and the potential dangers of Tesla's full self-driving software. (e) embedded_software: The failure in the software incident was related to the embedded software error in Tesla's full self-driving software. The safety campaign group, The Dawn Project, highlighted the dangers of Tesla's full self-driving software and conducted a safety test that resulted in the vehicle repeatedly running over the mannequin. The incident raised concerns about the functionality and safety of Tesla's autonomous driving features [131019].
Communication unknown Unknown
Application FALSE The software failure incident reported in the articles is not attributed to the application layer of the cyber-physical system (bugs, operating system errors, unhandled exceptions, or incorrect usage); based on the provided information, no application-layer involvement was reported.

Other Details

Category Option Rationale
Consequence non-human, theoretical_consequence (a) death: People lost their lives due to the software failure - There is no mention of any deaths resulting from the software failure incident reported in the articles. [131019] (b) harm: People were physically harmed due to the software failure - The articles mention that a child-size mannequin was hit over and over again by a Tesla in full self-driving mode during a safety test, but no actual people were physically harmed in the incident. [131019] (c) basic: People's access to food or shelter was impacted because of the software failure - There is no mention of people's access to food or shelter being impacted by the software failure incident. [131019] (d) property: People's material goods, money, or data was impacted due to the software failure - The articles discuss a recall of nearly 54,000 Tesla cars and SUVs due to issues with the full self-driving software, but there is no mention of direct impact on people's material goods, money, or data as a result of the software failure. [131019] (e) delay: People had to postpone an activity due to the software failure - There is no mention of people having to postpone an activity due to the software failure incident. [131019] (f) non-human: Non-human entities were impacted due to the software failure - The incident involved a child-size mannequin being repeatedly hit by a Tesla in full self-driving mode during a safety test, which can be considered as a non-human entity being impacted. [131019] (g) no_consequence: There were no real observed consequences of the software failure - The incident involving the Tesla in full self-driving mode hitting a child-size mannequin during a safety test can be considered as a consequence of the software failure. [131019] (h) theoretical_consequence: There were potential consequences discussed of the software failure that did not occur - The articles mention potential dangers highlighted by safety advocates regarding the use of Tesla's full self-driving software on public roads with untrained drivers, but these potential consequences did not occur in the reported incident. [131019] (i) other: Was there consequence(s) of the software failure not described in the (a to h) options? What is the other consequence(s)? - There are no other consequences of the software failure incident described in the articles. [131019]
Domain transportation, government The software failure incident reported in the news article is related to the transportation industry. The incident involves Tesla's full self-driving software, which is designed to assist drivers in operating their vehicles autonomously or semi-autonomously. The failure of this system, as highlighted in the article, led to safety concerns and investigations by regulatory authorities like the US National Highway Traffic Safety Administration (NHTSA) [Article 131019]. Additionally, the incident also touches upon the government sector as the NHTSA is actively investigating Tesla's Autopilot active driver assistance system, including the full self-driving software. This demonstrates the intersection of technology and government regulations in ensuring the safety and compliance of autonomous driving systems [Article 131019].

Sources
