Recurring |
one_organization, multiple_organization |
(a) The software failure incident related to Tesla's full self-driving software has happened again within the same organization. In February this year, Tesla recalled nearly 54,000 cars and SUVs because their full self-driving software was found to let them roll through stop signs without coming to a complete halt. The over-the-internet software update allowed the vehicles to go through junctions with a stop sign at up to 5.6 miles per hour. Tesla agreed to the recall after meetings with officials from NHTSA, and there were no reported crashes or injuries caused by the feature at that time [131019].
(b) The software failure incident related to Tesla's full self-driving software has also happened at other organizations or with their products and services. Safety advocates have argued that Tesla should not be allowed to test the vehicles on public roads with untrained drivers, since the software can malfunction and expose other motorists and pedestrians to danger; most car companies with similar software conduct tests with trained human safety drivers. Additionally, the NHTSA is investigating why Teslas on Autopilot have repeatedly crashed into emergency vehicles parked on roads, pointing to similar incidents involving other vehicles and driver-assistance systems [131019]. |
Phase (Design/Operation) |
design, operation |
(a) The software failure incident related to the design phase can be seen in the article where it is mentioned that Tesla recalled nearly 54,000 cars and SUVs because their full self-driving software was found to let them roll through stop signs without coming to a complete halt. This issue was due to an over-the-internet software update that allowed the vehicles to go through junctions with a stop sign at up to 5.6 miles per hour [131019].
(b) The software failure incident related to the operation phase is evident in the article where a Tesla driver complained that the full self-driving software caused a crash. The driver reported that their Model Y went into the wrong lane and was hit by another vehicle, despite receiving an alert halfway through the turn and trying to avoid other traffic. The driver mentioned that the car took control and 'forced itself into the incorrect lane,' leading to the crash [131019]. |
Boundary (Internal/External) |
within_system |
(a) within_system:
- The software failure incident in which a Tesla in full self-driving mode ran over a child-size mannequin during a test by a safety campaign group was within the system. The failure was attributed to the vehicle's inability to detect the stationary dummy in the road despite being in full self-driving mode [131019].
- The failure was also linked to the full self-driving software allowing the vehicle to roll through stop signs without coming to a complete halt, which was a feature of the software itself [131019].
(b) outside_system:
- The incident involving the Tesla running over the mannequin during the safety test was not explicitly attributed to factors originating from outside the system in the provided articles. |
Nature (Human/Non-human) |
non-human_actions, human_actions |
(a) The software failure incident occurring due to non-human actions:
- The incident involved a Tesla in full self-driving mode running over a child-size mannequin during a test by a safety campaign group. The vehicle failed to detect the stationary dummy's presence in the road and hit it repeatedly at an average speed of 25mph [131019].
- The failure was related to Tesla's full self-driving software, which was being tested on a test track in California under controlled conditions [131019].
- The safety test was carried out at the Willow Springs International Raceway and test track in Rosamond, California, where the incident occurred [131019].
(b) The software failure incident occurring due to human actions:
- The incident involved the testing of Tesla's full self-driving software, which was being beta-tested by selected Tesla drivers [131019].
- Safety advocates raised concerns that Tesla should not be allowed to test the vehicles on public roads with untrained drivers, suggesting that human actions in allowing untrained drivers to test the software could contribute to failures [131019].
- The NHTSA was investigating a complaint from a Tesla driver that the full self-driving software caused a crash; the driver reported that the car took control and 'forced itself into the incorrect lane' despite the driver's attempts to avoid other traffic [131019]. |
Dimension (Hardware/Software) |
hardware, software |
(a) The software failure incident related to hardware:
- The incident involving Tesla's full self-driving software running over a child-size mannequin during a safety test was primarily related to the failure of the vehicle's hardware components, such as cameras, ultrasonic sensors, and radar, to detect the stationary dummy's presence on the road [131019].
(b) The software failure incident related to software:
- The failure of Tesla's full self-driving software to detect the stationary dummy and repeatedly run over it during the safety test at the Willow Springs International Raceway was a result of contributing factors originating in the software itself. The incident highlighted the dangers of Tesla's full self-driving software and raised concerns about its effectiveness and safety [131019]. |
Objective (Malicious/Non-malicious) |
non-malicious |
(a) The software failure incident related to the Tesla in full self-driving mode running over a child-size mannequin during a test by a safety campaign group can be categorized as non-malicious. The incident occurred during a safety test conducted by The Dawn Project at a test track in California under controlled conditions [131019]. The failure was a result of the vehicle's inability to detect the stationary dummy's presence on the road, leading to the repeated collisions at an average speed of 25mph. The incident was part of a nationwide public safety ad campaign highlighting the dangers of Tesla's full self-driving software [131019].
Additionally, the investigation by the US National Highway Traffic Safety Administration (NHTSA) into Tesla's Autopilot active driver assistance system, including the full self-driving software, considers all relevant data and information that may assist in the investigation [131019]. This focus on understanding the root cause of the failure and ensuring the safety of the system further indicates a non-malicious context. |
Intent (Poor/Accidental Decisions) |
poor_decisions |
(a) The intent of the software failure incident related to poor_decisions:
- The failure of Tesla's full self-driving software to detect a stationary dummy on the road and repeatedly hit it during a safety test conducted by The Dawn Project at a test track in California [131019].
- Tesla's recall of nearly 54,000 cars and SUVs due to the full self-driving software allowing vehicles to roll through stop signs without coming to a complete halt, which was found to be a dangerous feature [131019].
(b) The intent of the software failure incident related to accidental_decisions:
- The complaint from a Tesla driver that the full self-driving software caused a crash by forcing the car into the wrong lane, resulting in a collision with another vehicle [131019]. |
Capability (Incompetence/Accidental) |
development_incompetence, accidental |
(a) The software failure incident related to development incompetence is evident in the article as it discusses the failure of Tesla's full self-driving software to detect a stationary dummy on the road during a safety test conducted by The Dawn Project [131019]. The incident highlights a lack of professional competence in the development of the software, leading to a potentially dangerous situation where the vehicle repeatedly ran over the mannequin without detecting it. Additionally, the article mentions a previous recall by Tesla due to the full self-driving software allowing vehicles to roll through stop signs without coming to a complete halt, indicating issues with the software development process [131019].
(b) The software failure incident related to accidental factors is also apparent in the article, particularly in the context of unintended consequences of the software's behavior. For example, the article mentions a complaint from a Tesla driver who reported that the full self-driving software caused a crash by taking control and forcing the vehicle into the incorrect lane, resulting in a collision with another vehicle [131019]. This incident illustrates how accidental factors, such as unexpected behavior or malfunctions in the software, can lead to adverse outcomes despite the driver's attempts to intervene. |
Duration |
permanent, temporary |
The software failure incident related to the Tesla full self-driving software can be categorized as both permanent and temporary.
(a) Permanent: The incident involving the Tesla full self-driving software running over a child-size mannequin during a safety test conducted by The Dawn Project can be considered a permanent failure, because the contributing factor, the software's inability to detect the stationary dummy's presence on the road, persisted across all circumstances of the test and led to repeated collisions at an average speed of 25mph [131019].
(b) Temporary: On the other hand, the incident in which Tesla recalled nearly 54,000 cars and SUVs because their full self-driving software allowed them to roll through stop signs without coming to a complete halt can be seen as a temporary failure. The contributing factor was introduced by certain circumstances but not all: a specific software update enabled the vehicles to go through junctions with a stop sign at up to 5.6 miles per hour, and Tesla agreed to the recall and made changes to address the issue [131019]. |
Behaviour |
crash, omission, value, other |
(a) crash: The software failure incident related to a crash is described in the article where a Tesla in full self-driving mode was reported to have repeatedly run over a child-sized mannequin during a safety test conducted by The Dawn Project at a test track in California [131019].
(b) omission: The software failure incident related to omission is evident in the article where Tesla recalled nearly 54,000 cars and SUVs because their full self-driving software was found to let them roll through stop signs without coming to a complete halt. This omission of stopping at stop signs was a failure of the software to perform its intended function [131019].
(c) timing: The software failure incident related to timing is not explicitly mentioned in the provided article.
(d) value: The software failure incident related to value is seen in the article where Tesla's full self-driving software allowed vehicles to go through junctions with a stop sign at up to 5.6 miles per hour without coming to a complete halt. This incorrect performance of the software's intended function led to a recall of the vehicles [131019].
(e) byzantine: The software failure incident related to a byzantine behavior, where the system behaves erroneously with inconsistent responses and interactions, is not explicitly mentioned in the provided article.
(f) other: The software failure incident related to other behavior is highlighted in the article where a Tesla driver complained that the full self-driving software caused a crash by taking control and forcing the car into the incorrect lane despite the driver's attempts to avoid other traffic. This behavior of the software causing the car to move into the wrong lane is a unique failure scenario not covered by the other options [131019]. |