Recurring |
one_organization, multiple_organization |
(a) The software failure incident related to hacking industrial vehicles has happened again within the same organization. The article mentions that cybersecurity researchers demonstrated in recent years that they could hack a Chevy Impala or a Jeep Cherokee to disable the vehicles' brakes or hijack their steering, which served as a wake-up call to the consumer automotive industry. Industrial automakers have now received a similar reminder: researchers from the University of Michigan presented findings from tests on industrial vehicles, including big rig trucks and school buses, that exposed vulnerabilities in their computer networks [46618].
(b) The software failure incident related to hacking industrial vehicles has also happened at multiple organizations. The article highlights that the researchers targeted most of their attacks on a 2006 semi-trailer and a 2001 school bus, without revealing the manufacturers to avoid unnecessary embarrassment. This indicates that similar vulnerabilities exist in various industrial vehicles that use the same communication standard, making them susceptible to hacking attacks [46618]. |
Phase (Design/Operation) |
design, operation |
(a) The software failure incident related to the design phase is evident in the article. The researchers at the University of Michigan were able to hack into industrial vehicles, such as big rig trucks and school buses, by sending digital signals within the internal network of the vehicles. They found that developing these attacks was easier than with consumer cars due to a common communication standard in the internal networks of most industrial vehicles [46618]. This vulnerability was exploited by the researchers to change the readout of the truck's instrument panel, trigger unintended acceleration, disable brakes, and even speed up the truck against the driver's will by sending spoofed signals to the vehicle's powertrain commands [46618].
(b) The software failure incident related to the operation phase is also highlighted in the article. The researchers were able to perform their tests by plugging a laptop directly into the OBD port on the dashboard of the target trucks, rather than searching for a wireless entry point into the vehicle that an actual malicious hacker would likely need. They argue that motivated attackers will find vulnerabilities offering over-the-Internet access to vehicles' vulnerable digital systems, and that attacks exploiting cellular connections to vehicles' infotainment systems have already been demonstrated [46618]. This indicates that the operation and use of the vehicles, including the presence of insecure telematics dongles that track gas mileage and location, can introduce factors leading to software failures and vulnerabilities. |
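As background for how injection over the internal network works once an attacker has OBD-port access, J1939 networks of the kind described above use 29-bit extended CAN identifiers that pack a message priority, a Parameter Group Number (PGN), and a source address. The sketch below (plain Python, no CAN hardware required) follows the publicly documented J1939 identifier layout; the specific priority and address values are illustrative, not taken from the article:

```python
def j1939_id(priority: int, pgn: int, source_addr: int) -> int:
    """Pack a 29-bit J1939 extended CAN identifier.

    Layout (high to low bits): 3-bit priority, 1-bit EDP, 1-bit DP,
    8-bit PDU Format, 8-bit PDU Specific, 8-bit source address.
    For broadcast (PDU2, PF >= 240) messages, PF and PS together
    carry the 18-bit PGN.
    """
    edp = (pgn >> 17) & 1
    dp = (pgn >> 16) & 1
    pf = (pgn >> 8) & 0xFF
    ps = pgn & 0xFF
    return ((priority << 26) | (edp << 25) | (dp << 24) |
            (pf << 16) | (ps << 8) | source_addr)

# Example: PGN 65276 (0xFEFC, the "Dash Display" parameter group that
# carries the fuel gauge reading) broadcast at default priority 6
# from source address 0.
frame_id = j1939_id(priority=6, pgn=0xFEFC, source_addr=0x00)
print(hex(frame_id))  # 0x18fefc00
```

Because every compliant heavy vehicle interprets this identifier layout the same way, a frame crafted once works across makes and models, which is the property the researchers exploited.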
Boundary (Internal/External) |
within_system |
(a) within_system: The software failure incident described in the article is primarily within the system. The researchers were able to hack into the internal network of industrial vehicles, such as big rig trucks and school buses, by sending digital signals within the system. They were able to manipulate various functions of the vehicles, including changing instrument panel readouts, triggering unintended acceleration, and disabling brakes, all by exploiting the common communication standard in the internal networks of these vehicles [46618]. The vulnerabilities exploited were inherent within the system's design and communication protocols.
(b) outside_system: The article does not provide information indicating that the software failure incident was caused by contributing factors originating from outside the system. The focus of the incident was on vulnerabilities within the internal networks of the industrial vehicles, rather than external factors. |
Nature (Human/Non-human) |
non-human_actions, human_actions |
(a) The software failure incident in the articles is primarily related to non-human actions. The incident involved cybersecurity researchers demonstrating how they could hack into industrial vehicles, such as a big rig truck and a school bus, by sending digital signals within the internal network of the vehicles. They were able to trigger unintended acceleration, disable brakes, change instrument panel readouts, and even speed up the vehicles against the driver's will by exploiting vulnerabilities in the vehicles' communication standard [46618]. The vulnerabilities in the vehicles' systems were exploited without direct human intervention in the vehicles' operation.
(b) While the software failure incident primarily resulted from non-human actions, it is important to note that the vulnerabilities exploited by the researchers were introduced by human actions during the design and implementation of the vehicles' communication systems. The use of a common communication standard in industrial vehicles made it easier for the researchers to craft attacks and manipulate the vehicles' systems. The researchers highlighted the need for the heavy-duty automakers to focus on defending against digital attacks on the vehicles' systems and suggested measures such as better segregating components of the vehicles' networks and implementing authentication measures to prevent impersonation of messages [46618]. |
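The authentication measure suggested above can be illustrated in miniature. One commonly proposed mitigation for CAN-style networks is appending a truncated message authentication code, computed with a key shared by legitimate controllers, so that spoofed frames from an attacker without the key are rejected. This is a minimal sketch of the idea using Python's standard hmac library, not a protocol deployed on these vehicles, and it ignores real constraints such as the 8-byte CAN payload limit:

```python
import hashlib
import hmac

KEY = b"shared-ecu-key"  # hypothetical pre-shared key among legitimate ECUs

def tag_frame(can_id: int, data: bytes, key: bytes = KEY) -> bytes:
    """Append a 4-byte truncated HMAC-SHA256 tag to a frame payload."""
    mac = hmac.new(key, can_id.to_bytes(4, "big") + data, hashlib.sha256)
    return data + mac.digest()[:4]

def verify_frame(can_id: int, tagged: bytes, key: bytes = KEY) -> bool:
    """Accept only frames whose tag matches, rejecting spoofed messages."""
    data, tag = tagged[:-4], tagged[-4:]
    expected = hmac.new(key, can_id.to_bytes(4, "big") + data,
                        hashlib.sha256).digest()[:4]
    return hmac.compare_digest(tag, expected)

genuine = tag_frame(0x18FEFC00, b"\x01\x02")
print(verify_frame(0x18FEFC00, genuine))  # True
# An attacker without the shared key cannot produce a valid tag:
forged = tag_frame(0x18FEFC00, b"\x01\x02", key=b"attacker-guess")
print(verify_frame(0x18FEFC00, forged))  # False
```

In this scheme, the impersonation attacks described in the article fail at the receiving controller, because a message claiming to come from the powertrain or instrument panel no longer verifies without the key.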
Dimension (Hardware/Software) |
hardware, software |
(a) The software failure incident reported in the articles also has a hardware dimension. The researchers' access path was physical: they plugged a laptop directly into the OBD port on the dashboard, and the vehicles' shared internal network hardware carried their injected signals to safety-critical components, which allowed them to manipulate various functions of the vehicles, such as changing instrument panel readouts, triggering unintended acceleration, and disabling brakes [46618].
(b) The software failure incident is also related to software vulnerabilities in the communication standard used in industrial vehicles. The J1939 open standard common to heavy vehicles allowed the researchers to easily send commands and replicate signals on the vehicles' networks without the need for extensive reverse engineering. This lack of standardization in industrial trucks made it simpler for the researchers to craft attacks compared to consumer vehicles [46618]. |
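To illustrate why an open standard lowers the attacker's effort: a captured identifier decodes directly into a PGN that can be looked up in the published J1939 parameter definitions, and payload bytes follow documented scalings, so no per-vehicle reverse engineering is needed. A sketch under those published definitions (the 0.4 %/bit scaling for the fuel level parameter, SPN 96, comes from the public J1939-71 tables; treat the details as illustrative):

```python
def decode_j1939_id(can_id: int):
    """Split a 29-bit J1939 identifier into priority, PGN, source address."""
    priority = (can_id >> 26) & 0x7
    dp_edp = (can_id >> 24) & 0x3
    pf = (can_id >> 16) & 0xFF
    ps = (can_id >> 8) & 0xFF
    # PDU2 (PF >= 240): PS is part of the PGN; PDU1: PS is a destination.
    pgn = (dp_edp << 16) | (pf << 8) | (ps if pf >= 240 else 0)
    return priority, pgn, can_id & 0xFF

def fuel_level_payload(percent: float) -> bytes:
    """Encode a fuel gauge reading (SPN 96: 0.4 %/bit, in payload byte 2)."""
    raw = round(percent / 0.4)
    return bytes([0xFF, raw] + [0xFF] * 6)  # bytes not used here set to 0xFF

priority, pgn, sa = decode_j1939_id(0x18FEFC00)
print(priority, pgn, sa)             # 6 65276 0
print(fuel_level_payload(100.0)[1])  # 250 -> displays as a full tank
```

Anyone sniffing the bus can decode PGN 65276 from a standards document rather than probing an individual vehicle, which is the contrast the researchers drew with proprietary consumer-car networks.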
Objective (Malicious/Non-malicious) |
malicious |
(a) The software failure incident described in the articles is malicious in nature. Cybersecurity researchers demonstrated how they could hack into industrial vehicles, such as big rig trucks and school buses, to manipulate various functions like changing instrument panel readouts, triggering unintended acceleration, disabling brakes, and even speeding up the vehicles against the driver's will [46618]. The researchers were able to send digital signals within the internal network of the vehicles to carry out these malicious actions, highlighting the vulnerabilities in the systems that could be exploited by attackers with harmful intent. |
Intent (Poor/Accidental Decisions) |
poor_decisions |
(a) The contribution of poor decisions to the software failure incident is evident in the article. The researchers from the University of Michigan hacked industrial vehicles, such as big rig trucks and school buses, by exploiting the common communication standard used in these vehicles' internal networks, a design choice that accepts commands without authenticating their source. They were able to easily send digital signals to manipulate various aspects of the vehicles, including changing instrument panel readouts, triggering unintended acceleration, and disabling brakes [46618].
Furthermore, the researchers found that the J1939 open standard common to heavy vehicles allowed them to send commands that could alter the truck's instrument panel readouts, spoof fuel levels, prevent alerts about compressed air in brakes, and even disable certain forms of brakes. These actions could potentially lead to dangerous situations on the road, highlighting the risks associated with the vulnerabilities in the software systems of these industrial vehicles [46618]. |
Capability (Incompetence/Accidental) |
development_incompetence, accidental |
(a) The software failure incident in the article can be attributed to development incompetence. The researchers from the University of Michigan were able to hack into industrial vehicles, such as a big rig truck and a school bus, by exploiting vulnerabilities in the common communication standard used in those vehicles' internal networks. They found that developing these attacks was actually easier than with consumer cars due to the open nature of the J1939 standard common to heavy vehicles, allowing them to craft attacks without the need for extensive reverse engineering. The simplicity of these hacks and the ease with which they were able to manipulate critical systems in the vehicles highlight the lack of robust security measures in place [46618].
(b) The software failure incident can also be considered accidental to some extent. The researchers demonstrated that they could send digital signals within the internal network of a big rig truck to trigger unintended acceleration, disable brakes, and manipulate various instrument panel readouts. While their tests were conducted by directly connecting a laptop to the vehicles' on-board diagnostic ports, they acknowledged that motivated attackers could find vulnerabilities offering over-the-Internet access to the vehicles' digital systems. The unintentional introduction of these weaknesses into the vehicles' networks raises concerns about the potential for remote attacks on heavy vehicles and highlights the risks of inadequate security measures [46618]. |
Duration |
temporary |
The software failure incident described in the article is more aligned with a temporary failure than a permanent one. The incident involved cybersecurity researchers demonstrating how they could hack into industrial vehicles, such as big rig trucks and school buses, by sending digital signals within the internal network of the vehicles, manipulating instrument panel readouts, triggering unintended acceleration, and disabling brakes [46618]. These failures occurred only under specific circumstances, namely while an attacker was actively injecting commands over the vehicles' common communication standard. They therefore reflect exploitable vulnerabilities in the system rather than a permanent, inherent flaw in the software's normal operation. |
Behaviour |
crash, omission, value, other |
(a) crash: The software failure incident described in the articles can be categorized as a crash in the sense that vehicle systems lost the ability to perform their intended functions. By sending digital signals within the internal network of a big rig truck, the researchers could disable the truck's brakes entirely, as well as change the readout of its instrument panel and trigger unintended acceleration [46618].
(b) omission: The software failure incident can also be categorized as an omission. The researchers were able to prevent an alert that the truck was about to run out of compressed air in its air brakes, leading to the vehicle instead applying its emergency brake without warning. This omission of the alert function could potentially lead to dangerous situations for the driver [46618].
(c) timing: The software failure incident does not align with a timing failure as the system was not described as performing its intended functions too late or too early in the articles.
(d) value: The software failure incident can be categorized as a value failure. The researchers were able to spoof a full tank of gas when the truck was running out of fuel, which is an incorrect performance of the system's intended function related to providing accurate fuel level information [46618].
(e) byzantine: The software failure incident does not align with a byzantine failure as the system's behavior was not described as inconsistent or having conflicting responses and interactions in the articles.
(f) other: The other behavior exhibited in the software failure incident is the ability of the researchers to override the truck's operation, such as speeding up the truck against the driver's will by sending signals spoofing the vehicle's powertrain commands. This kind of external control over the vehicle's functions goes beyond a typical crash or omission failure and highlights the potential for malicious manipulation of the system [46618]. |