Incident: Tesla Autopilot System Misreads Speed Limit Sign Causing Acceleration

Published Date: 2020-02-19

Postmortem Analysis
Timeline
1. The software failure incident in which Tesla vehicles were tricked into accelerating over the speed limit with a strip of tape was reported by McAfee researchers in February 2020; a related fatal crash involving sudden acceleration on Autopilot occurred in March 2018 [95680].
System
1. Tesla's MobilEye EyeQ3 camera system in the 2016 Model X and Model S [95680]
Responsible Organization
1. Researchers at McAfee, who induced the misclassification [95680]
2. Tesla and MobilEye, as developers of the camera system that misread the altered sign [95680]
Impacted Organization
1. Drivers of Tesla vehicles, as the software failure incident led to unintended acceleration, crashes, and injuries [95680]
2. The National Highway Traffic Safety Administration (NHTSA), which received complaints about the faulty technology and is investigating Tesla [95680]
Software Causes
1. A flaw in the machine learning systems used in automated driving, specifically the misclassification of an altered speed limit sign by the MobilEye camera on Tesla vehicles [95680]
Non-software Causes
1. Human interference: a two-inch strip of electrical tape placed on a speed limit sign caused the car's camera system to misread the limit [95680]
2. Misclassification of the altered sign by the MobilEye camera on a Tesla, leading the vehicle to speed up autonomously [95680]
3. A design limitation: the cameras in these specific Tesla models were not intended for fully autonomous driving [95680]
Impacts
1. Complaints to NHTSA about sudden unintended acceleration in Tesla vehicles cited 110 crashes and 52 injuries, with many drivers reporting that their cars accelerated on their own while parking or approaching curbs [95680]
2. The National Highway Traffic Safety Administration (NHTSA) received complaints about the faulty technology and opened an investigation into Tesla technology that accelerates over the speed limit, which has caused accidents and even deaths [95680]
3. Vehicles that suddenly accelerated on their own collided with parked cars, chain-link fences, and garage doors, causing property damage and safety risks [95680]
4. The incident raised concerns about weaknesses in the machine learning systems used in automated driving, highlighting the risks posed by misclassifications and vulnerabilities in autonomous driving technologies [95680]
Preventions
1. Implementing additional verification mechanisms in the camera system to prevent misinterpretation of altered signs like the one used in the McAfee experiment [95680]
2. Thoroughly testing and validating the machine learning algorithms used in automated driving systems to detect and address vulnerabilities that could lead to unintended acceleration [95680]
3. Closer communication and collaboration between researchers such as McAfee and companies such as Tesla and MobilEye, so that identified weaknesses are promptly addressed [95680]
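Prevention 1 can be illustrated with a minimal sketch: a plausibility gate that accepts a camera-detected speed limit only when it roughly agrees with an independent source such as map data. The function name, thresholds, and fallback policy below are illustrative assumptions, not Tesla's or MobilEye's actual logic.

```python
# Hypothetical plausibility gate for camera-detected speed limits.
# All names and thresholds here are illustrative assumptions, not
# the actual Tesla/MobilEye implementation.

def plausible_speed_limit(camera_mph: int, map_mph: int,
                          max_delta_mph: int = 15) -> int:
    """Return the speed limit the vehicle should act on.

    The camera reading is trusted only when it is within
    max_delta_mph of the limit recorded in map data; otherwise
    the system falls back to the map value.
    """
    if abs(camera_mph - map_mph) <= max_delta_mph:
        return camera_mph
    return map_mph

# The altered sign in the McAfee experiment (35 mph read as 85 mph)
# would fail the gate and fall back to the mapped limit:
assert plausible_speed_limit(85, 35) == 35
# A plausible reading (e.g. a recently lowered or raised limit) passes:
assert plausible_speed_limit(40, 35) == 40
```

A production system would combine several such independent signals (map data, fleet history, recent detections) rather than a single threshold, but the principle of refusing to act on an outlier reading is the same.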
Fixes
1. Deploy a software update that improves the camera system's ability to read speed limit signs accurately and prevents misclassifications like the one caused by the electrical tape [95680]
2. Enhance the machine learning algorithms used in automated driving systems to better distinguish legitimate road signs from altered ones, so the vehicle does not accelerate on false readings [95680]
3. Thoroughly test and validate the software so that it can distinguish intentional driver commands from erroneous inputs caused by external factors such as altered signs [95680]
4. Collaborate with researchers and experts in autonomous driving technology to address vulnerabilities and weaknesses that could lead to unintended acceleration [95680]
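As a complement to the fixes above, a rate limiter on the cruise set-speed would bound the damage even when a misread slips through: the controller never raises its target by more than a small step per detection, regardless of what the sign reader reports. This is a sketch under assumed parameters, not a description of Tesla's actual controller.

```python
# Hypothetical rate limiter on the cruise-control set speed.
# The step size and cap are illustrative assumptions.

def clamp_target_speed(current_mph: float, detected_limit_mph: float,
                       max_step_mph: float = 5.0,
                       absolute_cap_mph: float = 70.0) -> float:
    """Bound how far the cruise set-speed may rise in one update."""
    # Never target more than the system-wide cap.
    target = min(detected_limit_mph, absolute_cap_mph)
    # Never jump more than max_step_mph above the current speed.
    return min(target, current_mph + max_step_mph)

# A 35 -> 85 mph misread raises the set-speed to at most 40 mph,
# instead of letting the car accelerate toward 85 mph:
assert clamp_target_speed(35.0, 85.0) == 40.0
```

The design choice here is defense in depth: even if the perception layer is fooled, the control layer refuses to make a dramatic change on the basis of a single reading.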
References
1. McAfee researchers [95680]
2. National Highway Traffic Safety Administration (NHTSA) [95680]
3. Tesla [95680]
4. MobilEye (EyeQ3) [95680]
5. MIT Tech Review [95680]

Software Taxonomy of Faults

Category Option Rationale
Recurring one_organization, multiple_organization
(a) The failure has recurred within the same organization: Tesla vehicles have previously experienced sudden unintended acceleration, leading to crashes and injuries [95680].
(b) The incident also involves another organization's product: the camera system that misread the sign is supplied by MobilEye. In addition, the National Highway Traffic Safety Administration (NHTSA) received complaints about faulty technology causing Tesla vehicles to accelerate over the speed limit, resulting in accidents and even deaths, and regulators are investigating Tesla's technology [95680].
Phase (Design/Operation) design, operation
(a) Design: researchers at McAfee placed a two-inch piece of electrical tape on a speed limit sign, causing Tesla's camera system to misread a 35 mph limit as 85 mph; this flaw in the system's sign interpretation led the car to automatically accelerate to 50 mph when approaching the altered sign [95680].
(b) Operation: the National Highway Traffic Safety Administration (NHTSA) received complaints of Tesla vehicles experiencing sudden unintended acceleration while parking, approaching a garage, or even in traffic, indicating failures in the operation or behavior of the system [95680].
Boundary (Internal/External) within_system, outside_system
(a) within_system: the failure was primarily within the system, stemming from a flaw in the machine learning systems used for automated driving in Tesla vehicles; McAfee researchers were able to make the cars accelerate over the speed limit because the camera system misread the altered speed limit sign [95680].
(b) outside_system: contributing factors also originated outside the system; the sign was altered externally with a piece of electrical tape, and this external manipulation exposed a vulnerability in the system's ability to accurately interpret its environment [95680].
Nature (Human/Non-human) non-human_actions, human_actions
(a) Non-human actions: once the sign was altered, the failure itself unfolded without further human participation; the camera system autonomously misread the 35 mph sign as 85 mph, and the vehicle accelerated on its own [95680].
(b) Human actions: the McAfee researchers deliberately placed the electrical tape on the speed limit sign to deceive the car's camera system, triggering the misclassification and the unintended acceleration, and demonstrating a weakness in the machine learning systems used in automated driving [95680].
Dimension (Hardware/Software) hardware, software
(a) Hardware:
- The incident was triggered by a physical alteration to a speed limit sign using a two-inch strip of electrical tape, which the car's camera system then misread [95680].
- The altered sign caused the MobilEye camera on the Tesla vehicles to misclassify the limit, so the vehicles autonomously sped up toward 85 mph after reading a 35 mph sign [95680].
- The hardware component involved was the camera system in the Tesla vehicles, specifically the MobilEye EyeQ3, which captured the altered sign [95680].
(b) Software:
- The software failure concerned the machine learning systems used in automated driving, specifically the misclassification of the speed limit sign by the camera system [95680].
- The misclassification of the altered sign exposed a weakness in the machine learning algorithms, causing the vehicles to accelerate beyond the speed limit [95680].
- The software aspect involved the interaction between the camera system software and the machine learning algorithms that determined the vehicle's response to the perceived speed limit [95680].
Objective (Malicious/Non-malicious) malicious
(a) The failure was deliberately induced: McAfee researchers placed a two-inch piece of electrical tape on a speed limit sign to make Tesla vehicles misread the limit and accelerate beyond it [95680]. Although the researchers' goal was to demonstrate a weakness in the machine learning systems used in automated driving rather than to cause harm, the same technique could be exploited by genuinely malicious actors.
Intent (Poor/Accidental Decisions) poor_decisions, accidental_decisions
(a) poor_decisions: the incident stemmed from a flaw in the machine learning systems used in automated driving. McAfee researchers caused a targeted misclassification of the MobilEye camera on a Tesla with a tiny sticker-based modification to a speed limit sign, making the vehicle autonomously speed up to 85 mph on reading a 35 mph sign [95680]. Tesla and MobilEye were made aware of the issue, but Tesla did not respond to the research and stated it would not be fixing the problems the McAfee researchers uncovered [95680].
(b) accidental_decisions: MobilEye, the company that supplies the EyeQ3 camera systems for these Tesla models, dismissed the research, noting that the altered sign could easily have been misread by a human and that the cameras in those models are not designed for fully autonomous driving [95680]. This dismissal and downplaying of the issue could be seen as an accidental decision that contributed to the failure.
Capability (Incompetence/Accidental) development_incompetence
(a) Development incompetence: the incident in which a Tesla was tricked into accelerating over the speed limit with a strip of tape was caused by a flaw in the machine learning systems used in automated driving; McAfee researchers made the car autonomously speed up to 85 mph with a tiny sticker-based modification to a speed limit sign, exposing a weakness in the system [95680].
(b) Accidental: MobilEye noted that the altered sign could easily have been misread even by a human, suggesting that ordinary sign damage or defacement encountered in the real world could accidentally trigger the same misreading [95680].
Duration permanent
(a) The failure appears to be permanent in nature. The flaw in the Tesla vehicles' camera system caused them to misread a speed limit sign and autonomously accelerate beyond the intended speed, and related sudden-unintended-acceleration reports involved multiple crashes and injuries, including while drivers were parking or navigating traffic [95680]. The issue was systemic rather than a one-time occurrence, as evidenced by the volume of complaints received by the National Highway Traffic Safety Administration (NHTSA) and its ongoing investigation into Tesla's technology [95680]. Moreover, Tesla did not respond to requests for comment on the research and gave no indication it would fix the issues the researchers uncovered, further suggesting a permanent failure [95680].
Behaviour crash, omission, value
(a) crash: Tesla vehicles accelerated over the speed limit after the camera system misread a speed limit sign, and the cars continued accelerating until stopped by the driver, creating the potential for collisions [95680].
(b) omission: the system omitted to perform its intended function of correctly reading and interpreting the speed limit sign, leading to the incorrect acceleration of the vehicles [95680].
(d) value: the system performed its sign-reading function but produced an inaccurate value (85 mph instead of 35 mph), leading to the unintended acceleration of the vehicles [95680].

IoT System Layer

Layer Option Rationale
Perception None None
Communication None None
Application None None

Other Details

Category Option Rationale
Consequence death, harm
(a) death: one person lost his life in a crash linked to the software's behavior. In March 2018, Walter Huang's Tesla Model X crashed on Highway 101 in Mountain View after the vehicle, traveling on Autopilot, suddenly accelerated a few seconds before impact [95680].
Domain transportation
The software failure incident reported in the news article [95680] relates to the transportation industry. Tesla vehicles were tricked into accelerating over the speed limit by a flaw in the camera system's interpretation of speed limit signs, exposing a weakness in the machine learning systems used for automated driving. The article also notes that the National Highway Traffic Safety Administration (NHTSA) received complaints of Tesla vehicles experiencing 'sudden unintended acceleration' while parking in garages, at curbs, or even in traffic, leading to accidents and injuries. Tesla's technology has come under scrutiny for accelerating over the speed limit and causing accidents and even deaths, and NHTSA is investigating these issues, underscoring the incident's impact on transportation safety.
