Incident: Self-Driving System Failure During Audi A7 Road Trip.

Published Date: 2015-01-04

Postmortem Analysis
Timeline 1. The software failure incident with Audi's self-driving car happened during a ride along a freeway in Las Vegas at the previous year's CES [32613]. Estimation: Step 1: The article was published on 2015-01-04. Step 2: The incident occurred at the previous year's CES, which typically takes place in January. Step 3: The incident therefore most likely occurred in January 2014.
System unknown
Responsible Organization 1. The software failure incident in the Audi self-driving car was caused by the system itself, as it failed during a ride along a freeway in Las Vegas, prompting the driver to take over [32613].
Impacted Organization 1. The driver of Audi's self-driving car during the CES ride in Las Vegas [32613].
Software Causes 1. The article does not explicitly identify the software cause of the failure during the CES ride in Las Vegas [32613]; the specific software causes are therefore unknown.
Non-software Causes 1. The article does not identify specific non-software causes for the freeway failure; it notes only that the system is not designed for urban areas, where the driver is prompted to take over manual control [32613].
Impacts 1. The software failure incident during the ride in Audi's self-driving car at CES in Las Vegas resulted in the driver having to take over, indicating a lack of full autonomy and reliability in the system [32613].
Preventions 1. Conducting more extensive testing and simulations to identify and address potential glitches or failures in the self-driving system before deploying it on a long road trip [32613]. 2. Implementing better redundancy and fail-safe mechanisms in the self-driving software to ensure a smooth transition to manual control in case of system failure [32613]. 3. Enhancing the system's ability to adapt to changing road conditions, such as disappearing lane lines, by improving the integration of GPS data and sensor inputs for more robust decision-making [32613].
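Prevention 3 above implies a guidance-source fallback: when lane markings become unreadable (road construction, weathering), the car steers by GPS position and relative distance to surrounding traffic instead. A minimal sketch of that degradation logic, assuming hypothetical function names, inputs, and thresholds that do not come from the article:

```python
def choose_guidance(lane_line_confidence, gps_fix_ok, neighbors_tracked):
    """Pick a steering reference, degrading gracefully as inputs drop out.

    lane_line_confidence: 0.0-1.0 camera confidence in the lane markings
    gps_fix_ok: whether the GPS position is currently reliable
    neighbors_tracked: number of surrounding vehicles tracked by radar/LIDAR
    """
    MIN_LANE_CONFIDENCE = 0.6  # hypothetical tuning value
    if lane_line_confidence >= MIN_LANE_CONFIDENCE:
        return "lane_lines"
    if gps_fix_ok and neighbors_tracked > 0:
        # The article's fallback: GPS plus relative distance to other traffic.
        return "gps_plus_traffic"
    # No trustworthy reference left: hand control back to the driver.
    return "request_driver_takeover"
```

The point of the sketch is the ordering: the system prefers direct lane observation, falls back to indirect references, and asks for a human only when every reference is gone.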
Fixes 1. Conduct thorough testing and validation of the self-driving technology to identify and address any potential glitches or issues before deploying it on a road trip [32613]. 2. Implement redundant sensors and backup systems to ensure fail-safes in case of sensor failures or inaccuracies [32613]. 3. Enhance the software algorithms to improve the system's ability to handle various driving conditions, including urban environments with obstacles and threats [32613].
References 1. The article draws its account of the software failure incident from a ride in Audi's self-driving car along a freeway in Las Vegas at the previous year's CES, during which the system failed and the driver had to take over [32613].

Software Taxonomy of Faults

Category Option Rationale
Recurring unknown (a) The article does not mention any previous incidents of software failure within Audi or with its self-driving technology. Therefore, there is no information available to suggest that a similar incident has happened before within the same organization [32613]. (b) The article does not provide any information about similar incidents happening at other organizations with their self-driving technologies. Hence, there is no evidence to suggest that a similar incident has occurred at multiple organizations [32613].
Phase (Design/Operation) design, operation (a) The article mentions that during a ride in Audi's self-driving car at a previous event, the system failed, and the driver had to take over. This incident occurred after a year of development, indicating that there were glitches that needed to be addressed during the design phase of the system [32613]. (b) The article also states that when the self-driving car approaches an urban area, the system will alert the driver to take over manual control. If the driver does not take over within a set amount of time, the car will turn on its flashers and pull over onto the shoulder. This highlights a safety measure in place to address potential failures related to the operation or misuse of the system [32613].
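The handover behaviour described in (b) — alert the driver near urban areas, and if the driver does not take over within a set time, turn on the flashers and pull onto the shoulder — can be sketched as a small state machine. The timeout value and all identifiers below are hypothetical, since the article says only "a set amount of time":

```python
from enum import Enum, auto

HANDOVER_TIMEOUT_S = 10.0  # hypothetical grace period; not specified in the article

class Mode(Enum):
    PILOTED = auto()          # self-driving active
    AWAITING_DRIVER = auto()  # alert issued, waiting for manual takeover
    MANUAL = auto()           # driver has control
    SAFE_STOP = auto()        # flashers on, pulling onto the shoulder

def step(mode, approaching_urban_area, driver_took_over, seconds_waiting):
    """One decision step of the handover logic described in the article."""
    if mode is Mode.PILOTED and approaching_urban_area:
        return Mode.AWAITING_DRIVER
    if mode is Mode.AWAITING_DRIVER:
        if driver_took_over:
            return Mode.MANUAL
        if seconds_waiting >= HANDOVER_TIMEOUT_S:
            return Mode.SAFE_STOP
    return mode
```

The SAFE_STOP state is the operational safety measure the article describes: rather than continuing into an environment it cannot handle, the car degrades to a minimal-risk condition.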
Boundary (Internal/External) within_system (a) The failure occurred within the system itself, specifically within the self-driving technology Audi developed for the A7. The article describes how, during a ride in Audi's self-driving car at CES, the system failed and the driver had to take over [32613], indicating that the failure was internal to Audi's self-driving system.
Nature (Human/Non-human) non-human_actions (a) The software failure incident occurring due to non-human actions: The article mentions that during a ride in Audi's self-driving car along a freeway in Las Vegas at a previous event, the system failed, and the driver had to take over [32613]. This indicates a software failure incident that occurred without human participation, possibly due to glitches or faults in the system itself. (b) The software failure incident occurring due to human actions: The article does not provide specific information about a software failure incident occurring due to contributing factors introduced by human actions.
Dimension (Hardware/Software) hardware, software (a) The software failure incident related to hardware: The article mentions that the Audi A7 self-driving car experienced a system failure during a ride along a freeway in Las Vegas, which required the driver to take over [32613]. This incident could potentially be attributed to hardware issues within the sensors or the onboard computer processing the information from the sensors. (b) The software failure incident related to software: The article discusses the self-driving technology called "Piloted Driving" by Audi, which relies on processing information from various sensors and the car's GPS location through an onboard computer to control braking, acceleration, and steering [32613]. If there was a failure in the software algorithms or programming controlling these functions, it could lead to a software-related failure incident.
Objective (Malicious/Non-malicious) non-malicious (a) The article does not mention any malicious software failure incident related to Audi's self-driving car [32613]. (b) The article describes a non-malicious software failure incident: during a ride in Audi's self-driving car along a freeway in Las Vegas, the system failed and the driver had to take over. The incident was attributed not to malicious intent but to a glitch in the system that required manual intervention. Audi had been working on solving glitches in its self-driving technology, and this incident highlighted the need for further development and refinement so the system can operate without failures [32613].
Intent (Poor/Accidental Decisions) unknown The article does not provide information about whether the failure stemmed from poor decisions or accidental decisions.
Capability (Incompetence/Accidental) development_incompetence (a) The article mentions a previous incident where during a ride in Audi's self-driving car at CES, the system failed, and the driver had to take over. This incident could be attributed to development incompetence as it indicates that there were glitches in the system even after a year of development [32613]. (b) The article does not provide specific information about a software failure incident occurring due to accidental factors.
Duration unknown The article does not specify whether the software failure was permanent or temporary.
Behaviour crash (a) crash: During a previous ride in Audi's self-driving car at CES, the system failed and the driver had to take over, indicating a crash in which the system lost state and stopped performing its intended functions [32613]. (b) omission: The article does not mention any instance where the system omitted to perform its intended functions. (c) timing: The article does not indicate that the system performed its intended functions too late or too early. (d) value: The article does not mention the system performing its intended functions incorrectly. (e) byzantine: The article does not describe any inconsistent responses or interactions by the system. (f) other: Not applicable; the observed behaviour is covered by the crash category [32613].

IoT System Layer

Layer Option Rationale
Perception sensor (a) The failure related to the perception layer of the cyber physical system that failed was due to contributing factors introduced by sensor error. The article mentions that the Audi A7 self-driving car was equipped with various sensors such as long-range forward radar, rear-facing and side-facing radar sensors, a laser scanner (LIDAR), a 3D camera, and four smaller cameras. These sensors were responsible for collecting information about the car's surroundings and the car's GPS location, which was then processed by an onboard computer to control braking, acceleration, and steering. The article also highlights the importance of lane lines for guiding the car's path and mentions that if lane lines disappear due to road construction or weathering, the car would need to rely on its GPS and relative distance from other traffic around [32613].
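The perception pipeline described above — radar, LIDAR, and cameras feeding an onboard computer that controls braking, acceleration, and steering — can be illustrated with a toy fusion step. This is a hedged sketch under simplifying assumptions (conservative minimum-range fusion and a fixed time-gap braking rule); none of the names or thresholds are Audi's actual algorithm:

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """Hypothetical per-cycle range estimates to the nearest forward obstacle."""
    radar_range_m: float   # long-range forward radar
    lidar_range_m: float   # laser scanner (LIDAR)
    camera_range_m: float  # 3D camera estimate

def fused_obstacle_range(frame: SensorFrame) -> float:
    """Conservative fusion: trust whichever sensor reports the closest obstacle."""
    return min(frame.radar_range_m, frame.lidar_range_m, frame.camera_range_m)

def brake_command(range_m: float, speed_mps: float, min_gap_s: float = 2.0) -> bool:
    """Brake when the time gap to the obstacle falls below min_gap_s seconds."""
    if speed_mps <= 0.0:
        return False  # stationary or reversing: no forward-gap braking needed
    return range_m / speed_mps < min_gap_s
```

Taking the minimum across sensors is one simple way to get the redundancy the Fixes section calls for: a single sensor under-reporting an obstacle's distance is enough to trigger braking.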
Communication unknown The article does not mention any specific software failure incident related to the communication layer of the cyber physical system that failed. Therefore, it is unknown whether the failure was at the link_level or connectivity_level.
Application unknown The article does not provide information about whether the failure was related to the application layer of the cyber physical system.

Other Details

Category Option Rationale
Consequence no_consequence The article does not mention the software failure incident leading to consequences such as death, harm, impact on basic needs, property loss, delay, or effects on non-human entities, nor does it discuss potential consequences. The consequence of the incident is therefore 'no_consequence' [32613].
Domain transportation (a) The failed system was intended to support the transportation industry. The article mentions Audi's self-driving technology, Piloted Driving, which was being tested in an Audi A7 for a road trip from the San Francisco Bay Area to Las Vegas [32613]. The system was designed to control braking, acceleration, and steering of the vehicle, indicating its application in the transportation sector.

Sources
