| Recurring |
unknown |
(a) The software failure incident related to hacking planes using the SIMON framework and PlaneSploit app developed by Hugo Teso has not been reported to have happened again within the same organization or with its products and services. The incident was a unique demonstration by Teso at a security summit in Amsterdam, and there is no mention of a repeat occurrence within the same organization.
(b) There is no specific mention in the article of a similar incident happening at other organizations or with their products and services. The focus of the article is primarily on the demonstration by Hugo Teso of his hacking tools and the potential vulnerabilities in air-traffic security systems. |
| Phase (Design/Operation) |
design, operation |
(a) The software failure incident related to the design phase can be seen in the article where a German security consultant and commercial pilot, Hugo Teso, developed a framework of malicious code called SIMON and an Android app called PlaneSploit to exploit airline security software [18263]. Teso spent three years coding these tools and demonstrated the ability to remotely control a virtual airplane by sending radio signals to its flight-management system, highlighting weaknesses in the authentication methods of current security systems. The design-phase factor is rooted in those security systems themselves: because their authentication is weak, malicious software such as SIMON can manipulate the flight-management system, which could lead to serious security vulnerabilities in airline systems (a conceptual sketch of the authentication weakness follows this entry).
(b) The software failure incident related to the operation phase is evident in the article where Hugo Teso showcased how his tools could be used to change the speed, altitude, and direction of a virtual airplane, manipulate pilot displays, and control the aircraft remotely by tapping preloaded commands on the Android app [18263]. This demonstrates the potential for misuse of the system in operation, as unauthorized individuals could exploit these vulnerabilities to interfere with the navigation and control of an aircraft, posing a significant operational risk. |
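The authentication weakness highlighted above is the key technical point: in the demonstration, the flight-management software acted on uplinked commands without strongly verifying their origin. The sketch below is a minimal, hypothetical Python illustration — the command format, `SHARED_KEY`, and handler names are invented and are not based on any real avionics protocol — contrasting a receiver that applies any well-formed command with one that rejects commands lacking a valid HMAC tag.

```python
import hashlib
import hmac

# Hypothetical illustration only: shows why acting on unauthenticated
# commands is a design weakness, and how message authentication closes it.
SHARED_KEY = b"ground-station-secret"  # assumed pre-shared key, for illustration


def handle_command_unauthenticated(message: str) -> str:
    """Applies any well-formed command; the sender is never verified."""
    field, value = message.split("=")
    return f"applied {field} -> {value}"  # a spoofed 'ALT=1000' would be applied


def handle_command_authenticated(message: str, tag: str) -> str:
    """Rejects commands whose HMAC tag does not match the shared key."""
    expected = hmac.new(SHARED_KEY, message.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, tag):
        return "rejected: message not authenticated"
    field, value = message.split("=")
    return f"applied {field} -> {value}"


if __name__ == "__main__":
    # The forged command is applied by the first handler...
    print(handle_command_unauthenticated("ALT=1000"))
    # ...but rejected by the second unless it carries a valid tag.
    print(handle_command_authenticated("ALT=1000", tag="0" * 64))
```

This is only a sketch of the general weakness class the article describes; real avionics links involve certified hardware and protocols that are not modeled here.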
| Boundary (Internal/External) |
within_system |
(a) within_system: The software failure incident described in the article is primarily within the system. The incident involved a German security consultant developing a framework of malicious code called SIMON and an Android app called PlaneSploit to exploit airline security software [18263]. The software allowed the consultant to remotely control a virtual airplane by sending radio signals to its flight-management system, demonstrating the ability to change the speed, altitude, and direction of the aircraft. Although the software was designed to work in virtual environments and not on actual aircraft, the weaknesses it exploited, such as the weak authentication methods of the targeted security systems, lie within those systems themselves, indicating that the contributing factors originated from within the system.
(b) outside_system: The software failure incident does not involve contributing factors that originate from outside the system. The incident was contained within the development and demonstration of the malicious code and app by the security consultant, with no external factors mentioned as contributing to the failure [18263]. |
| Nature (Human/Non-human) |
non-human_actions |
(a) The software failure incident in the article is related to non-human actions. Although the German security researcher, Hugo Teso, deliberately developed the SIMON framework of malicious code and the PlaneSploit Android app to exploit airline security software and potentially hijack an airplane remotely, the weaknesses his tools exploit stem from the targeted systems themselves, notably the lack of strong authentication methods, which are contributing factors introduced without human participation at the time of the incident [18263].
(b) The software failure incident in the article is not directly attributed to human actions causing the failure. While the security researcher, Hugo Teso, developed the malicious software, the failure itself was due to vulnerabilities in the airline security systems and the lack of strong authentication methods, rather than specific human actions leading to the failure [18263]. |
| Dimension (Hardware/Software) |
hardware, software |
(a) The software failure incident in the article is related to hardware. The incident involved a German security consultant developing a framework of malicious code called SIMON and an Android app called PlaneSploit to exploit airline security software by sending radio signals to the flight-management system of a virtual airplane. The consultant used flight-management hardware bought on eBay and publicly available flight-simulator software to demonstrate the vulnerabilities [18263].
(b) The software failure incident in the article is also related to software. The incident involved the development of the SIMON framework and the PlaneSploit Android app, which were designed to attack and exploit airline security software. The software allowed for the remote control of a virtual airplane by modifying navigation parameters, changing pilot display screens, and controlling cockpit lights [18263]. |
| Objective (Malicious/Non-malicious) |
malicious |
(a) The objective of the software failure incident was malicious, as the German security consultant and commercial pilot, Hugo Teso, developed a framework of malicious code called SIMON and an Android app called PlaneSploit with the intent to attack and exploit airline security software. He demonstrated the ability to remotely hijack an airplane by sending radio signals to its flight-management system, allowing him to change the speed, altitude, and direction of a virtual airplane. Teso also mentioned that the tools could be used to manipulate various aspects of the plane's navigation and control, such as changing what appears on the pilot's display screen or turning off lights in the cockpit [18263].
(b) The software failure incident was non-malicious in the sense that the Federal Aviation Administration stated that the hacking technique demonstrated by Teso does not pose a threat to real flights because it does not work on certified flight hardware. The FAA clarified that the described technique cannot engage or control the aircraft's autopilot system using the Flight Management System or prevent a pilot from overriding the autopilot. Teso himself mentioned that he developed SIMON to work only in virtual environments, not on actual aircraft. Additionally, Teso informed the relevant aviation safety officials in the United States and Europe about his findings, indicating a responsible approach to disclosure rather than putting aircraft and passengers at risk [18263]. |
| Intent (Poor/Accidental Decisions) |
poor_decisions |
(a) The intent of the software failure incident related to poor_decisions:
The software failure incident described in the article was related to poor decisions made by the German security consultant and commercial pilot, Hugo Teso. He developed a framework of malicious code called SIMON and an Android app called PlaneSploit with the intent to demonstrate vulnerabilities in airline security systems. Although Teso claimed that his tools were meant only for simulations and not for real flights, the potential risks and implications of his actions were significant. His decision to create and showcase these tools raised concerns about the security of air-traffic systems and the potential for exploitation by malicious actors [18263].
(b) The intent of the software failure incident related to accidental_decisions:
The software failure incident was not related to accidental decisions or unintended mistakes. It was a deliberate effort by Hugo Teso to develop tools that could exploit vulnerabilities in airline security systems. Teso spent three years coding the tools he used and demonstrated their capabilities at a security summit in Amsterdam. His actions were intentional and aimed at highlighting the weaknesses in current security systems rather than being accidental or unintended [18263]. |
| Capability (Incompetence/Accidental) |
accidental |
(a) The software failure incident related to development incompetence is not applicable in this case as the incident was not caused by a lack of professional competence by humans or the development organization. The incident was a result of intentional actions by a security researcher who developed a framework of malicious code and an Android app to exploit airline security software [18263].
(b) The software failure incident can be categorized as accidental in the sense that the vulnerabilities exploited were not deliberately introduced by the makers of the airline security systems; they were unintended weaknesses, such as the lack of strong authentication methods, that the security researcher, Hugo Teso, was able to discover and exploit with tools that could potentially be used to remotely hijack an airplane. While Teso's demonstration itself was deliberate, the underlying flaws it exposed were accidental rather than the result of incompetence [18263]. |
| Duration |
temporary |
(a) The software failure incident described in the article is temporary in nature. The incident involved a security researcher developing a framework of malicious code called SIMON, along with an Android app called PlaneSploit, that could potentially be used to hack into and remotely control airplanes. The researcher demonstrated the ability to manipulate a virtual airplane's speed, altitude, and direction by sending radio signals to its flight-management system. However, it was clarified that the hacking technique does not pose a threat to real flights because it does not work on certified flight hardware. The Federal Aviation Administration stated that the described technique cannot engage or control the aircraft's autopilot system using the Flight Management System or prevent a pilot from overriding the autopilot. The researcher also mentioned that he developed SIMON to work only in virtual environments, not on actual aircraft [18263].
(b) The incident could be considered temporary as the software failure was demonstrated in a controlled environment using a flight simulator and specific hardware and software products. The researcher used flight-management hardware bought on eBay and publicly available flight-simulator software that contains at least some of the same computer coding as real flight software. While the risk was acknowledged by some experts, it was emphasized that the researcher had not disclosed all the details of the vulnerabilities he exploited, and steps could be taken to patch any security holes before they could be maliciously exploited. The researcher also reached out to the companies that make the systems he exploited and aviation safety officials in the United States and Europe to address his concerns [18263]. |
| Behaviour |
other |
(a) crash: The software failure incident described in the article does not involve a crash where the system loses state and does not perform any of its intended functions. Instead, the incident involves a demonstration of how a security consultant developed a framework of malicious code that could be used to attack and exploit airline security software [18263].
(b) omission: The software failure incident does not involve a failure due to the system omitting to perform its intended functions at an instance(s). Instead, the incident revolves around the demonstration of tools that could be used to hijack an airplane remotely by exploiting vulnerabilities in airline security software [18263].
(c) timing: The software failure incident does not involve a failure due to the system performing its intended functions correctly but too late or too early. The incident focuses on the potential exploitation of security vulnerabilities in airline systems rather than issues related to timing [18263].
(d) value: The software failure incident does not involve a failure due to the system performing its intended functions incorrectly. The incident is centered around the development of a framework of malicious code that could be used to attack and exploit airline security software, showcasing potential vulnerabilities in the systems [18263].
(e) byzantine: The software failure incident does not involve a failure due to the system behaving erroneously with inconsistent responses and interactions. The incident primarily highlights the potential risks associated with exploiting security vulnerabilities in airline systems, as demonstrated by the security consultant's tools [18263].
(f) other: The behavior of the software failure incident can be categorized as a security vulnerability demonstration where a security consultant showcased the potential for exploiting weaknesses in airline security software to remotely control an airplane using a developed framework of malicious code and an Android app [18263]. |