Published Date: 2018-01-15
Postmortem Analysis | |
---|---|
Timeline | 1. The software failure incident in Hawaii, where a false ballistic missile alert was sent out, happened on January 13, 2018 [Article 67026]. 2. The investigation into the incident was concluded on January 30, 2018, leading to the firing of the employee responsible for the false alert [Article 66872]. |
System | 1. Hawaiian Emergency Management Agency's computer system for sending emergency alerts [Article 67471] 2. Integrated Public Alert and Warning System (IPAWS) developed by FEMA [Article 67026] 3. Hawaii's alert system software design, management controls, and human factors [Article 66872] |
Responsible Organization | 1. The Hawaii Emergency Management Agency employee who triggered the false ballistic missile alert [Article 66872] 2. The computer system design of the Hawaiian Emergency Management Agency (HEMA) that allowed for the mistaken selection of the "missile alert" option instead of the "test missile alert" option [Article 67471] |
Impacted Organization | 1. Hawaii’s emergency management agency [67470, 67471, 66872] 2. Residents and visitors in Hawaii [67026] 3. The Federal Communications Commission (FCC) [67470, 67026, 66872] 4. Hawaii Governor David Ige [66872] |
Software Causes | 1. Poor computer software design contributed to the false missile alert incident in Hawaii [Article 66872]. 2. A "terribly designed" user interface in the computer system used by the Hawaiian Emergency Management Agency to send emergency alerts led to the false missile alert [Article 67471]. |
Non-software Causes | 1. Insufficient management controls within the Hawaii Emergency Management Agency contributed to the false alert incident [Article 66872]. 2. Human factors, such as confusion and a history of mistaking drills for real events, played a role in the failure [Article 66872]. 3. Insufficient training and communication within the agency led to the false alert [Article 66872]. 4. The practice of conducting drills during a shift change added to the confusion and the error in sending out the alert [Article 66872]. |
Impacts | 1. The employee who sent out the false alarm was fired, the agency administrator resigned, a second employee was suspended without pay, and a third resigned before any disciplinary action was taken [67470, 66872]. 2. The incident caused statewide panic, with people seeking shelter and contacting loved ones, fearing they could be living their last moments [67026]. 3. Efforts to correct the mistake were delayed by phone lines jammed with anxious callers and by the governor not knowing his Twitter login [67470]. 4. The false alarm left Hawaii in chaos and confusion for 38 minutes until the error was rectified [66872]. 5. The incident exposed weaknesses in management controls, computer software design, and human factors that must be addressed to prevent such errors in the future [66872]. 6. It also raised concerns about the effectiveness and reliability of the alert system, prompting calls for a reevaluation and potential overhaul of the current system [67026]. |
Preventions | 1. Designing a clearer, more user-friendly interface for the alert software, with distinct and easily differentiable options for test versus real alerts, plus additional confirmation prompts to prevent accidental selection of the wrong option [Article 67471]. 2. Conducting regular training and drills so employees are familiar with the procedures and protocols for sending alerts, especially during shift changes, to avoid confusion and mistakes [Article 66872]. 3. Establishing stronger management controls within the emergency management agency, such as multiple levels of confirmation before an alert is sent and proper oversight of employees with a history of confusing drills with real alerts [Article 66872]. |
Fixes | 1. Implement stronger confirmation prompts for those sending alerts to prevent accidental activation of real-world alerts [66872]. 2. Improve training for employees to ensure they can differentiate between drills and real-world events [66872]. 3. Eliminate practice drills during shift changes to avoid confusion and potential mistakes [66872]. 4. Update the computer software design to prevent easily overlooked differences between test alerts and real alerts [67471]. 5. Require a second person to confirm messages before they are sent out to prevent individual errors [67471]. |
References | 1. Federal Communications Commission (FCC) [67470, 67471, 66872] 2. Hawaii Emergency Management Agency (HEMA) [67471] 3. Hawaii Governor David Ige [66872] 4. Hawaii State Adjutant General Maj. Gen. Joe Logan [66872] 5. Retired Brig. Gen. Bruce Oliveira [66872] 6. Computer security expert Graham Cluley [67471] 7. CNN [66872] 8. CNET [67026] |
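The fixes above center on stronger confirmation prompts and a second-person sign-off before a live alert can go out. A minimal sketch of such a dispatch gate follows; the names, labels, and structure here are illustrative assumptions, not drawn from the actual HEMA system:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative alert kinds. The real system's menu labels differed by
# one word, which is exactly the ambiguity this sketch tries to remove.
LIVE = "LIVE MISSILE ALERT"
DRILL = "TEST MISSILE ALERT"

@dataclass
class AlertRequest:
    kind: str                       # LIVE or DRILL
    operator: str                   # who initiated the alert
    typed_confirmation: str         # operator must retype the kind verbatim
    approver: Optional[str] = None  # second person, required for LIVE

def dispatch(request: AlertRequest) -> str:
    # Retyping the alert kind replaces a click-through prompt, so a
    # one-word difference in a drop-down can no longer go unnoticed.
    if request.typed_confirmation != request.kind:
        raise ValueError("confirmation phrase does not match alert kind")
    # A live alert additionally requires sign-off from a distinct second
    # person, per the two-person fix suggested in [67471].
    if request.kind == LIVE and (
        request.approver is None or request.approver == request.operator
    ):
        raise ValueError("live alert requires a distinct second approver")
    return f"sent: {request.kind}"
```

Under this scheme a drill dispatches on a single operator's typed confirmation, while a live alert is rejected until a second, different person approves it.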
Category | Option | Rationale |
---|---|---|
Recurring | one_organization, multiple_organization | (a) The software failure incident having happened again at one_organization: - The incident in Hawaii where a false ballistic missile alert was sent out due to human error was not the first time such a mistake had occurred. The employee who triggered the false alert had a history of confusing drill and real-world events [Article 66872]. - The same employee had twice before mistaken drills for real alerts, indicating a pattern of confusion in distinguishing between exercises and actual alerts [Article 67470]. (b) The software failure incident having happened again at multiple_organization: - The incident in Hawaii highlighted issues with the design of the computer system used by the Hawaiian Emergency Management Agency (HEMA) to send emergency alerts. The system's user interface was criticized for having a "terribly designed" menu that made it easy to select the wrong alert option [Article 67471]. - The Federal Communications Commission (FCC) report on the incident in Hawaii pointed out that there were problems with the software design, management controls, and human factors that contributed to the false alert, indicating broader issues beyond just one organization [Article 66872]. |
Phase (Design/Operation) | design, operation | (a) The software failure incident related to the design phase: The false ballistic missile alert in Hawaii was attributed to a "terribly designed" user interface in the computer system used by the Hawaiian Emergency Management Agency (HEMA) to send emergency alerts. The system had a drop-down menu with options for missile alerts, including one labeled "test missile alert" and another labeled "missile alert." The employee mistakenly selected the "missile alert" option instead of the intended "test missile alert" option, leading to the false alert being sent out to the public [Article 67471]. (b) The software failure incident related to the operation phase: The false ballistic missile alert incident in Hawaii was primarily caused by human error during a shift change at the emergency operation center. An employee misunderstood the drill call and incorrectly activated the "real-world" alert code, leading to the alert being sent out statewide. The employee who triggered the false alert had a history of confusing drill and real-world events, indicating an operational failure in ensuring proper procedures and confirmation prompts for sending alerts [Article 66872]. |
Boundary (Internal/External) | within_system, outside_system | (a) The software failure incident in Hawaii, where a false ballistic missile alert was sent out, was primarily caused by factors originating from within the system. The incident was triggered by an employee who mistakenly selected the "missile alert" option instead of the "test missile alert" option from a drop-down menu on the computer system used by the Hawaiian Emergency Management Agency [Article 67471]. The employee who sent out the alert had a history of confusing drill and real-world events, indicating an internal issue with the employee's understanding and actions [Article 66872]. (b) Contributing factors that originated from outside the system also played a role in the software failure incident. The Federal Communications Commission (FCC) report highlighted that Hawaii did not have reasonable safeguards in place to prevent human error from resulting in the transmission of a false alert, indicating a lack of external oversight or regulations to ensure proper alert procedures [Article 66872]. Additionally, the incident occurred during a time of high tensions over North Korea's nuclear program, which heightened the panic and response to the false alert, showing how external geopolitical factors influenced the impact of the software failure [Article 67470]. |
Nature (Human/Non-human) | non-human_actions, human_actions | (a) The software failure incident occurring due to non-human actions: - The computer system's user interface placed the genuine "missile alert" option directly next to the harmless "test missile alert" option in the same drop-down menu, a design that made mis-selection likely [Article 67471]. - The FCC report cited poor computer software design and insufficient management controls, alongside human factors, as contributing both to the false alert and to the delay in canceling it [Article 66872]. (b) The software failure incident occurring due to human actions: - The employee who triggered the false ballistic missile alert had a history of confusing drill and real-world events, and he mistakenly sent out the alert believing the threat was real [Article 66872]. - The employee did not hear the full drill message, leading to the belief that the missile threat was real [Article 66872]. - The delay in correcting the false alert was partly due to human factors, such as the governor not being able to access his Twitter account promptly to send out a corrective message [Article 67470]. |
Dimension (Hardware/Software) | software | (a) The software failure incident occurring due to hardware: The articles do not attribute the incident to any hardware fault; the alert was triggered, transmitted, and belatedly corrected entirely through software systems [Article 67471, Article 67026]. (b) The software failure incident occurring due to software: The computer system used by the Hawaiian Emergency Management Agency was criticized for a "terribly designed" user interface, with the genuine alert option placed next to the harmless test alert option, leading to confusion and the selection of the wrong option [Article 67471]. The system also restricted users to sending only prewritten messages, which delayed the follow-up message correcting the mistake [Article 67026]. The FCC report highlighted that insufficient management controls, poor computer software design, and human factors contributed to the false alert incident [Article 66872]. |
Objective (Malicious/Non-malicious) | non-malicious | (a) The software failure incident in Hawaii, where a false ballistic missile alert was sent out, was non-malicious. The incident was triggered by an employee who mistakenly selected the "missile alert" option instead of the "test missile alert" option from a drop-down menu on the computer system used by the Hawaiian Emergency Management Agency (HEMA) to send emergency alerts. This employee did not realize it was an exercise and clicked through the confirmation prompt, leading to the alert being sent out to the public [Article 67471]. (b) The incident was further exacerbated by a series of human errors, poor software design, and lack of proper management controls. The employee who triggered the alert had a history of confusing drills with real-world events, and there were issues with the confirmation prompts for sending alerts, as well as delays in issuing a correction message. The FCC report highlighted that there were insufficient safeguards in place to prevent human error from resulting in a false alert, indicating a non-malicious failure [Article 66872]. |
Intent (Poor/Accidental Decisions) | poor_decisions, accidental_decisions | (a) The intent of the software failure incident was due to poor_decisions: - The incident was caused by a series of poor decisions and management controls, poor computer software design, and human factors that contributed to the false ballistic missile alert in Hawaii. The internal investigation found that insufficient management controls were in place, along with poor software design and human errors [Article 66872]. - The FCC report highlighted that there were insufficient safeguards in place to prevent human error from resulting in the transmission of a false alert, indicating poor decision-making in the setup of the alert system [Article 66872]. (b) The intent of the software failure incident was due to accidental_decisions: - The false ballistic missile alert in Hawaii was triggered by an employee who mistakenly believed the threat was real, leading to the accidental activation of the alert system. The employee had a history of confusing drill and real-world events, indicating an accidental decision rather than intentional action [Article 66872]. - The employee who sent out the false alert stated that he did not hear the drill call indicating it was an exercise, leading to the accidental belief that the missile threat was real and sending out the alert [Article 66872]. |
Capability (Incompetence/Accidental) | development_incompetence, accidental | (a) The software failure incident occurring due to development_incompetence: - The computer system used by the Hawaiian Emergency Management Agency had a poorly designed user interface, with the options for a test missile alert and a real missile alert placed closely together in one menu, reflecting a lack of professional competence in designing a safety-critical interface [Article 67471]. - The FCC found that Hawaii did not have reasonable safeguards in place to prevent human error from resulting in a false alert, pointing to inadequate system design and management controls [Article 66872]. (b) The software failure incident occurring accidentally: - The employee who triggered the false ballistic missile alert claimed he did not hear the drill call of "exercise, exercise, exercise" and believed the missile threat was real, leading to the accidental activation of the alert [Article 66872]. - The employee had a history of confusing drill and real-world events, and during a shift-change drill he incorrectly activated the "real-world" alert code instead of conducting a drill, an accidental error in the process [Article 66872]. |
Duration | temporary | (a) The software failure incident in Hawaii related to the false ballistic missile alert was temporary. The incident was triggered by an employee mistakenly selecting the "missile alert" option instead of the "test missile alert" option from a drop-down menu, leading to the issuance of a real alert instead of a drill [Article 67471]. The employee who sent out the alert believed it was real due to a misunderstanding during a shift change and hearing the phrase "this is not a drill" without hearing the full exercise announcement [Article 67470]. The incident lasted for about 38 minutes until a follow-up message correcting the mistake was sent out [Article 67026]. (b) The software failure incident was also influenced by contributing factors introduced by certain circumstances but not all. The incident was attributed to insufficient management controls, poor computer software design, and human factors [Article 66872]. The employee who triggered the false alert had a history of confusing drill and real-world events, indicating a specific human factor contributing to the failure [Article 66872]. Additionally, the FCC report highlighted that Hawaii did not have reasonable safeguards in place to prevent human error from resulting in a false alert, indicating a lack of certain necessary measures to prevent such incidents [Article 66872]. |
Behaviour | timing, value, other | (a) crash: The incident was not a crash; the system did not lose its state and remained fully operational, delivering exactly the alert that was selected [66872]. (b) omission: The incident was not an omission; the system did not fail to act, it acted on the wrong instruction [66872]. (c) timing: The correction of the mistake was significantly delayed: the follow-up message rectifying the false alert took about 38 minutes, partly because the system restricted users to prewritten messages [67026]. (d) value: The system performed its intended function incorrectly: a real-world alert was dispatched when a drill alert was intended, with significant consequences and panic among the population [66872]. (e) byzantine: The software failure incident does not align with byzantine behavior as described in the articles. (f) other: The failure can also be characterized as a user interface design flaw. A one-word difference between adjacent drop-down options and the lack of clear prompts made it easy for the employee to select, and click through confirmation of, the wrong option [67471]. |
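The interface flaw described under Behaviour, where a live alert and a drill differ by one word in the same menu, can be addressed at the type level so that exercise traffic can never reach the public broadcast path. The following is a hypothetical sketch of that idea, not the HEMA system's actual design:

```python
from enum import Enum

class AlertMode(Enum):
    DRILL = "drill"   # exercise traffic only
    LIVE = "live"     # real public alert

class Channel(Enum):
    EXERCISE = "internal exercise loop"
    BROADCAST = "public wireless emergency alert"

def route(mode: AlertMode) -> Channel:
    # The drill/live decision is made once, as an explicit typed value;
    # everything downstream inherits it, so there is no adjacent menu
    # entry whose mis-selection crosses from exercise into broadcast.
    return Channel.EXERCISE if mode is AlertMode.DRILL else Channel.BROADCAST
```

Compared with two near-identical strings in one drop-down, this split means a drill workflow never exposes the live option at all, so a selection error stays within its channel rather than escalating to the public.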
Layer | Option | Rationale |
---|---|---|
Perception | None | None |
Communication | None | None |
Application | None | None |
Category | Option | Rationale |
---|---|---|
Consequence | basic, delay, theoretical_consequence | (a) death: No deaths caused by the software failure are reported in the articles; people feared for their lives during the 38 minutes before the correction, but no loss of life resulted [Article 67470, Article 67026]. (b) harm: There is no direct mention of physical harm caused to individuals due to the software failure in the articles. (c) basic: The false missile alert led people to seek immediate shelter, impacting their access to shelter [Article 67470]. (d) property: There is no direct mention of people's material goods, money, or data being impacted due to the software failure in the articles. (e) delay: The delayed correction of the false alert caused a 38-minute period of panic and fear among the residents and visitors in Hawaii [Article 66872]. (f) non-human: The incident primarily affected human individuals, causing panic and fear among the population [Article 67470, Article 67026, Article 66872]. (g) no_consequence: Not applicable; the incident had significant consequences, including panic, fear, and confusion among the residents and visitors [Article 67470, Article 67026, Article 66872]. (h) theoretical_consequence: No actual ballistic missiles were launched; the false alert caused panic and fear but did not correspond to a real attack [Article 66872]. (i) other: No consequences beyond the panic, fear, confusion, and delayed correction are described in the articles. |
Domain | government | (a) The failed system was intended to support the government industry. The incident involved the Hawaii Emergency Management Agency mistakenly sending out a false ballistic missile alert, causing panic and confusion among the residents and visitors of Hawaii. The incident led to the resignation of the agency's chief and the firing of the employee responsible for the false alarm [67470, 67471, 66872]. (l) The failed system was also related to the government industry as it involved the Hawaii Emergency Management Agency, a state organization responsible for coordinating emergency response. The incident highlighted issues with management controls, software design, and human factors that contributed to the false alert and the delayed correction message. The agency administrator resigned, and other employees faced disciplinary actions [66872]. (m) The incident was not related to an industry outside of the options provided. |
Article ID: 67470
Article ID: 67471
Article ID: 67026
Article ID: 66872