| Recurring |
unknown |
(a) The article does not mention any previous software failure within Audi or with its self-driving technology, so there is no indication that a similar incident had happened before within the same organization [32613].
(b) The article likewise provides no information about similar incidents involving other organizations' self-driving technologies, so there is no evidence that a similar incident has occurred at multiple organizations [32613]. |
| Phase (Design/Operation) |
design, operation |
(a) The article mentions that during a ride in Audi's self-driving car at a previous event, the system failed, and the driver had to take over. This incident occurred after a year of development, indicating that there were glitches that needed to be addressed during the design phase of the system [32613].
(b) The article also states that when the self-driving car approaches an urban area, the system will alert the driver to take over manual control. If the driver does not take over within a set amount of time, the car will turn on its flashers and pull over onto the shoulder. This highlights a safety measure in place to address potential failures related to the operation or misuse of the system [32613]. |
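The handover behaviour described above can be sketched as a small state machine. This is a minimal illustration only: the timeout value, state names, and function are hypothetical assumptions, since the article says only that the driver must take over within "a set amount of time" before the car turns on its flashers and pulls over.

```python
from enum import Enum, auto

HANDOVER_TIMEOUT_S = 10.0  # hypothetical; the article does not give the actual value


class DrivingState(Enum):
    PILOTED = auto()          # system is driving autonomously
    AWAITING_DRIVER = auto()  # driver has been alerted to take over
    MANUAL = auto()           # driver has taken manual control
    SAFE_STOP = auto()        # flashers on, pulling onto the shoulder


def next_state(state, approaching_urban_area, driver_took_over, seconds_since_alert):
    """Advance the handover state machine by one step."""
    if state is DrivingState.PILOTED and approaching_urban_area:
        return DrivingState.AWAITING_DRIVER  # alert the driver
    if state is DrivingState.AWAITING_DRIVER:
        if driver_took_over:
            return DrivingState.MANUAL
        if seconds_since_alert >= HANDOVER_TIMEOUT_S:
            return DrivingState.SAFE_STOP  # flashers on, pull over
    return state
```

The key safety property this models is that inaction by the driver never leaves the car without a controller: the system either stays piloted, hands over, or degrades to a safe stop.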
| Boundary (Internal/External) |
within_system |
(a) The software failure incident occurred within the system itself, specifically within the self-driving technology Audi developed for the A7. The article describes how, during a ride in Audi's self-driving car at CES, the system failed and the driver had to take over [32613], indicating that the failure was internal to Audi's self-driving system. |
| Nature (Human/Non-human) |
non-human_actions |
(a) The software failure incident occurring due to non-human actions:
The article mentions that during a ride in Audi's self-driving car along a freeway in Las Vegas at a previous event, the system failed, and the driver had to take over [32613]. This indicates a software failure incident that occurred without human participation, possibly due to glitches or faults in the system itself.
(b) The software failure incident occurring due to human actions:
The article does not provide specific information about a software failure incident occurring due to contributing factors introduced by human actions. |
| Dimension (Hardware/Software) |
hardware, software |
(a) The software failure incident related to hardware:
The article mentions that the Audi A7 self-driving car experienced a system failure during a ride along a freeway in Las Vegas, which required the driver to take over [32613]. This failure could plausibly have stemmed from hardware issues within the sensors or the onboard computer processing their data.
(b) The software failure incident related to software:
The article discusses the self-driving technology called "Piloted Driving" by Audi, which relies on processing information from various sensors and the car's GPS location through an onboard computer to control braking, acceleration, and steering [32613]. If there was a failure in the software algorithms or programming controlling these functions, it could lead to a software-related failure incident. |
| Objective (Malicious/Non-malicious) |
non-malicious |
(a) The article does not mention any malicious software failure incident related to the self-driving car developed by Audi [32613].
(b) The article describes a non-malicious software failure incident: during a ride in Audi's self-driving car along a freeway in Las Vegas, the system failed and the driver had to take over. The incident was attributed not to malicious intent but to a glitch in the system that required manual intervention. Audi had been working to resolve glitches in its self-driving technology, and the incident highlighted the need for further development and refinement before the system could operate reliably without failures [32613]. |
| Intent (Poor/Accidental Decisions) |
unknown |
The article does not provide information about a software failure incident related to poor decisions or accidental decisions. |
| Capability (Incompetence/Accidental) |
development_incompetence |
(a) The article mentions a previous incident where during a ride in Audi's self-driving car at CES, the system failed, and the driver had to take over. This incident could be attributed to development incompetence as it indicates that there were glitches in the system even after a year of development [32613].
(b) The article does not provide specific information about a software failure incident occurring due to accidental factors. |
| Duration |
unknown |
The article does not specify whether the software failure incident was permanent or temporary. |
| Behaviour |
crash |
(a) crash: The article mentions a previous incident where during a ride in Audi's self-driving car at CES, the system failed, and the driver had to take over, indicating a crash scenario where the system lost state and did not perform its intended functions [32613].
(b) omission: The article does not specifically mention any instance where the system omitted to perform its intended functions at an instance.
(c) timing: The article does not provide information about the system performing its intended functions too late or too early.
(d) value: The article does not mention any instance where the system performed its intended functions incorrectly.
(e) byzantine: The article does not describe any inconsistent responses or interactions by the system.
(f) other: No behaviour beyond the crash described in (a) is reported; the system failed and the driver had to take over during a previous ride in Audi's self-driving car [32613]. |