Incident: Tesla Autopilot Failure: Model 3 Crashes into Police Car

Published Date: 2019-12-13

Postmortem Analysis
Timeline
1. The software failure incident, in which a Tesla Model 3 rear-ended a parked police car in Connecticut, occurred on December 7, 2019 [93070].
System
1. Tesla's Autopilot driver assistance system [93070]
Responsible Organization
1. The driver of the Tesla Model 3, who was checking on his dog in the back seat while the vehicle was on Autopilot, leading to the rear-ending of a parked police car and a disabled motor vehicle [93070].
Impacted Organization
1. The driver of the Tesla Model 3 involved in the crash [93070]
2. The National Highway Traffic Safety Administration (NHTSA), which is investigating crashes involving Tesla vehicles and Autopilot [93070]
3. The Connecticut State Police, who handled the crash scene [93070]
Software Causes
1. Tesla's Autopilot driver assistance system was believed to be engaged at the time of the crash and did not prevent the vehicle from striking the stationary police car or the disabled vehicle ahead [93070].
Non-software Causes
1. The driver of the Tesla was checking on his dog in the back seat prior to the collision, diverting his attention from the road [93070].
2. The Tesla rear-ended a parked police car and then struck a disabled motor vehicle, indicating a lack of driver attentiveness while Autopilot was engaged [93070].
Impacts
1. The Tesla Model 3 rear-ended a parked police car in Connecticut, damaging both vehicles [93070].
2. The driver of the Tesla was issued a misdemeanor summons for Reckless Driving and Reckless Endangerment [93070].
3. No one involved in the crash, including the dog in the back seat of the Tesla, was seriously injured [93070].
Preventions
1. Implementing stricter driver monitoring to ensure drivers remain attentive and do not engage in distracting activities while Autopilot is active (see the sketch after this list) [93070].
2. Improving the Autopilot system's ability to detect stationary objects, such as parked vehicles, in order to avoid collisions with them [93070].
3. Conducting more rigorous testing and validation of the Autopilot system to identify and address foreseeable misuse scenarios, such as drivers turning around to check on pets in the back seat [93070].
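Prevention 1 turns on how a driver-monitoring watchdog escalates when attention lapses. The following is a minimal sketch of that idea only; it is not Tesla's implementation, and every name and threshold in it (DriverMonitor, visual_after, disengage_after, and so on) is hypothetical. The watchdog accumulates continuous inattention time and escalates from a visual alert to an audible alert to disengaging the assistance feature:

```python
from dataclasses import dataclass
from enum import Enum


class Escalation(Enum):
    NONE = 0
    VISUAL_ALERT = 1
    AUDIBLE_ALERT = 2
    DISENGAGE = 3


@dataclass
class MonitorConfig:
    # Hypothetical thresholds, in seconds of continuous inattention.
    visual_after: float = 3.0
    audible_after: float = 6.0
    disengage_after: float = 10.0


class DriverMonitor:
    """Escalates warnings as continuous driver inattention accumulates."""

    def __init__(self, config: MonitorConfig) -> None:
        self.config = config
        self.inattentive_seconds = 0.0

    def update(self, attentive: bool, dt: float) -> Escalation:
        # Reset the timer as soon as attention is re-established.
        if attentive:
            self.inattentive_seconds = 0.0
            return Escalation.NONE
        self.inattentive_seconds += dt
        if self.inattentive_seconds >= self.config.disengage_after:
            return Escalation.DISENGAGE
        if self.inattentive_seconds >= self.config.audible_after:
            return Escalation.AUDIBLE_ALERT
        if self.inattentive_seconds >= self.config.visual_after:
            return Escalation.VISUAL_ALERT
        return Escalation.NONE


if __name__ == "__main__":
    monitor = DriverMonitor(MonitorConfig())
    # Simulate a driver who looks away (e.g., to check on a pet) for 12 seconds.
    for tick in range(12):
        level = monitor.update(attentive=False, dt=1.0)
        print(f"t={tick + 1:>2}s -> {level.name}")
```

Under these assumed thresholds, a driver who stays inattentive past the final threshold would be warned twice before the feature disengages.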
Fixes
1. Enforcing stricter safeguards on Autopilot use so that drivers cannot engage in distracting activities, such as checking on pets in the back seat, while the system is active [93070].
2. Reviewing and potentially redesigning the Autopilot system to detect and respond to stationary objects, such as parked vehicles, reliably enough to avoid rear-end collisions (a toy decision rule is sketched below) [93070].
3. Enhancing the Autopilot system with improved sensors, algorithms, and artificial intelligence capabilities for better situational awareness and decision-making in complex traffic scenarios [93070].
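Fix 2 concerns braking for stationary obstacles. As a purely illustrative sketch (not a description of how Autopilot actually works, with the function names and the 2.5 s threshold assumed for illustration), a time-to-collision check captures the core decision: a parked vehicle has zero speed, so the closing speed equals the ego vehicle's speed and the time to collision shrinks quickly:

```python
def time_to_collision(gap_m: float, ego_speed_mps: float,
                      obstacle_speed_mps: float) -> float:
    """Seconds until impact at the current closing speed; inf if not closing."""
    closing_speed = ego_speed_mps - obstacle_speed_mps
    if closing_speed <= 0:
        return float("inf")
    return gap_m / closing_speed


def should_brake(gap_m: float, ego_speed_mps: float,
                 obstacle_speed_mps: float = 0.0,
                 ttc_threshold_s: float = 2.5) -> bool:
    # A parked police car has obstacle_speed_mps == 0, so the time to
    # collision is simply the gap divided by the ego vehicle's speed.
    return time_to_collision(gap_m, ego_speed_mps, obstacle_speed_mps) < ttc_threshold_s


if __name__ == "__main__":
    # 40 m from a parked vehicle at 25 m/s (~56 mph): TTC = 1.6 s, so brake.
    print(should_brake(gap_m=40.0, ego_speed_mps=25.0))  # True
```

A production system would fuse multiple sensors and track uncertainty; the point here is only that stationary obstacles need an explicit decision path, which this incident suggests was either absent or ineffective.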
References
1. National Highway Traffic Safety Administration (NHTSA) [93070]
2. Connecticut State Police [93070]

Software Taxonomy of Faults

Category Option Rationale
Recurring: one_organization, multiple_organization

(a) The software failure incident related to Tesla's Autopilot system has happened before within the same organization. The National Highway Traffic Safety Administration (NHTSA) is investigating the 12th Tesla crash that may be tied to the vehicle's Autopilot system [93070]. This incident adds to earlier crashes in which Autopilot was believed to be engaged, including fatal crashes since 2016.

(b) The NHTSA's repeated investigations of crashes in which a driver assistance system was believed to be engaged indicate a recurring issue with advanced driver assistance systems across the automotive industry, not only at Tesla [93070].
Phase (Design/Operation): design, operation

(a) Design: The National Highway Traffic Safety Administration (NHTSA) is investigating whether the crash is tied to the vehicle's advanced Autopilot driver assistance system, pointing to a potential shortcoming in the design of Autopilot that could have contributed to the crash [93070].

(b) Operation: The Tesla driver was checking on his dog in the back seat while the vehicle was in Autopilot mode, a misuse of the system during operation that led to the collision with the parked police car [93070].
Boundary (Internal/External): within_system

(a) within_system: The failure originated within the system: Autopilot was engaged at the time of the crash (the driver stated he had his vehicle on "auto-pilot"), yet it did not detect or brake for the stationary police car and the disabled vehicle ahead [93070]. The driver's inattention provides context, but under this category the contributing factor is the behavior of the Autopilot system itself.
Nature (Human/Non-human): non-human_actions, human_actions

(a) non-human_actions: With Autopilot engaged, the system failed to avoid the parked police car and the disabled vehicle, a contributing factor introduced without human participation [93070].

(b) human_actions: The driver contributed by checking on his dog in the back seat instead of supervising the vehicle, and was issued a misdemeanor summons for Reckless Driving and Reckless Endangerment [93070].
Dimension (Hardware/Software): software

(a) hardware: The crash was not attributed to a hardware failure; no defect in the vehicle's physical components was reported [93070].

(b) software: The failure originated in the software-based Autopilot driver assistance feature. The driver stated that he had the vehicle on "auto-pilot" and was distracted while the software was engaged, and the system did not prevent the collision with the parked police car and the disabled motor vehicle [93070].
Objective (Malicious/Non-malicious): non-malicious

(a) malicious: There is no indication of malicious intent. The driver was using the Autopilot feature while checking on his dog and was not attempting to harm the system or deliberately cause the crash [93070].

(b) non-malicious: The crash resulted from the driver's inattention combined with the limitations of the Autopilot system, rather than from any intentional act aimed at causing the failure [93070].
Intent (Poor/Accidental Decisions): poor_decisions, accidental_decisions

The incident, in which a Tesla Model 3 on Autopilot crashed in Connecticut on December 7, 2019, can be attributed to both poor and accidental decisions.

1. Poor decisions: The driver chose to divert his attention from the road to check on his dog in the back seat while the vehicle was in Autopilot mode, a poor decision that contributed to the crash [93070].

2. Accidental decisions: The driver did not intend the consequences of that lapse; his unintended inattention while Autopilot was engaged led to the rear-ending of the police car and the disabled motor vehicle [93070].
Capability (Incompetence/Accidental): accidental

(a) development_incompetence: The article provides no evidence that the failure was introduced through a lack of professional competence in development [93070].

(b) accidental: The failure was accidental: the Tesla Model 3 rear-ended a parked police car while the driver, who stated the vehicle was on "auto-pilot", was checking on his dog in the back seat [93070].
Duration: temporary

The failure was temporary rather than permanent: it manifested under specific circumstances, namely an engaged Autopilot approaching stationary vehicles while the driver was not supervising the road, rather than in all operating conditions [93070].
Behaviour: crash

(a) crash: The failure is best categorized as a crash: the Autopilot system ceased to perform its intended function of avoiding obstacles, and the Tesla Model 3 rear-ended a parked police car and then struck a disabled motor vehicle while in "auto-pilot" mode [93070].

(b) omission: There is no specific indication that the system omitted its intended function on particular occasions apart from the crash itself.

(c) timing: The incident does not involve the system performing its intended function too late or too early.

(d) value: The incident does not involve the system performing its intended function incorrectly, in the sense of producing wrong outputs.

(e) byzantine: The incident does not exhibit inconsistent responses or interactions.

(f) other: Not applicable; the observed behaviour is covered by (a) [93070].

IoT System Layer

Layer Option Rationale
Perception None None
Communication None None
Application None None

Other Details

Category Option Rationale
Consequence: non-human, no_consequence

(a) death: Autopilot has been engaged in at least three fatal U.S. Tesla crashes since 2016, but no one died in this incident [93070].

(b) harm: The driver was issued a misdemeanor summons for Reckless Driving and Reckless Endangerment, but no one involved in the crash was seriously injured [93070].

(g) no_consequence: No one involved in the crash was seriously injured; the non-human occupant, the dog in the back seat, was also unharmed [93070].
Domain: transportation

(a) The failed system belongs to the transportation industry. The incident involved a Tesla Model 3 crashing into a parked police car in Connecticut while the Autopilot feature was engaged [93070]. The National Highway Traffic Safety Administration is investigating the crash, the 12th Tesla crash possibly tied to the vehicle's Autopilot system [93070]. The driver stated that he was checking on his dog in the back seat while the vehicle was on "auto-pilot" prior to the collision [93070]. The incident highlights the risks and challenges of advanced driver assistance systems in the transportation sector.
