Incident: Most Healthcare Apps Fail to Meet NHS Standards - Lack of Regulatory Compliance

Published Date: 2021-02-16

Postmortem Analysis
Timeline
1. The software failure incident happened around the time the article was published, 2021-02-16 [110883].
System
1. Healthcare apps reviewed by Orcha for several NHS trusts [110883]
Responsible Organization
1. App developers who created healthcare apps that did not meet standards, lacked security updates, and did not comply with regulatory requirements [110883]
Impacted Organization
1. Patients using healthcare apps [110883]
2. NHS staff recommending apps to patients [110883]
Software Causes
1. Lack of security updates for more than two years in an app to help smokers quit [110883]
2. Apps offering complex medical support without expert input [110883]
3. A physiotherapy app offering exercise plans without professional input [110883]
Non-software Causes
1. Insufficient awareness of regulatory requirements among app developers [110883]
2. Lack of expert or professional input into app content [110883]
3. Difficulty for healthcare professionals in knowing which tools to recommend to patients [110883]
Impacts
1. A smoking cessation app went without security updates for over two years, exposing users to potential vulnerabilities and risks [110883].
2. Apps offered medical support without expert input, potentially providing inaccurate or harmful information to users [110883].
3. Apps failed to meet regulatory requirements, such as the CE quality mark needed for apps acting as medical devices, raising potential legal and safety issues [110883].
4. Healthcare professionals found it difficult to identify and recommend reliable health apps to patients because of the lack of clear standards and regulation [110883].
Preventions
1. Regular security updates and maintenance could have prevented the incident in the healthcare apps reviewed by Orcha [110883].
2. Ensuring compliance with regulatory requirements, such as obtaining the necessary quality marks and registrations from national regulators, could have prevented the incident [110883].
3. Thorough research involving a large sample of users during app development could have helped prevent such failures [110883].
Fixes
1. Implement stricter regulation and oversight of healthcare apps to ensure they meet required standards, such as the CE quality mark for medical devices [110883].
2. Provide clearer guidance for app developers on regulatory requirements and quality standards for healthcare apps [110883].
3. Encourage collaboration between healthcare professionals, app developers, and regulatory bodies so that apps are properly vetted against healthcare standards [110883].
4. Conduct thorough research and testing, involving a large sample of users, to ensure healthcare apps are effective, safe, and user-friendly [110883].
References
1. Orcha chief executive Liz Ashall-Payne
2. Dr. Jermaine Ravalier from Bath Spa University
3. NHSX digital team
4. Apple and Google app review processes
5. Government regulator MHRA
6. National regulators, including the Care Quality Commission, Healthcare Inspectorate Wales, Healthcare Improvement Scotland, and the Regulation and Quality Improvement Authority

Software Taxonomy of Faults

Category Option Rationale
Recurring multiple_organization
(a) one_organization: The article does not indicate that a similar incident happened again within the same organization or with its products and services, so this is unknown.
(b) multiple_organization: Orcha, a firm that reviews healthcare apps for several NHS trusts, found that 80% of the apps it reviewed did not meet its standards, indicating that the problem of substandard healthcare apps is prevalent across multiple organizations and developers [110883].
Phase (Design/Operation) design, operation
(a) design: Apps were found to have poor information, a lack of security updates, and developers showed insufficient awareness of regulatory requirements [110883].
(b) operation: An app to help smokers quit had not received security updates in over two years, indicating a failure to maintain and operate the system properly [110883].
Boundary (Internal/External) within_system
(a) within_system: The failure of healthcare apps to meet NHS standards is attributed to factors originating within the system itself: poor information, lack of security updates, insufficient awareness of regulatory requirements, and services offered without the necessary expertise or professional input [110883]. These are all internal factors in the development and maintenance of the apps, indicating a within-system failure.
Nature (Human/Non-human) non-human_actions, human_actions
(a) non-human_actions: The reviewed apps suffered from poor information and a lack of security updates [110883], pointing to failures introduced without direct human participation, such as inadequate maintenance or oversight.
(b) human_actions: Developers may unintentionally fail to meet regulatory requirements because they do not understand what is required, leading to apps that fall short of necessary standards [110883]. The article also stresses the importance of thorough research and design in app development, indicating that inadequate human effort in these areas can contribute to software failures [110883].
Dimension (Hardware/Software) software
(a) hardware: The article does not describe any contributing factors originating in hardware [110883].
(b) software: The failures are primarily due to factors originating in software: a diabetes management app offering complex medical support without expert back-up, a physiotherapy app providing exercise plans without professional input, and a quit-smoking app lacking security updates [110883].
Objective (Malicious/Non-malicious) non-malicious
(a) non-malicious: A diabetes management app offered complex medical support without any back-up from experts, a physiotherapy app offered exercise plans without visible input from professionals, and an app to help smokers quit had not received security updates in more than two years [110883]. These failures stem from poor development practices or lack of expertise rather than any intent to harm the system.
(b) malicious: The article does not describe any failure introduced by humans with intent to harm the system.
Intent (Poor/Accidental Decisions) poor_decisions, accidental_decisions
(a) poor_decisions: Many of the healthcare apps reviewed by Orcha failed to meet standards because of poor information, missing security updates, and insufficient awareness of regulatory requirements [110883].
(b) accidental_decisions: Developers may unintentionally fail to meet regulations because they do not realize what is required [110883]. Orcha chief executive Liz Ashall-Payne notes that innovators with good intentions can acquire a bad reputation because they do not know which regulations their products must comply with, resulting in accidental failures [110883].
Capability (Incompetence/Accidental) development_incompetence, accidental
(a) development_incompetence: A diabetes management app offered complex medical support without back-up from experts, a physiotherapy app offered exercise plans without visible input from professionals, and a quit-smoking app had not received security updates in more than two years. These failures reflect a lack of the professional expertise and maintenance required for the apps' intended functions [110883].
(b) accidental: Developers may unintentionally fail to meet regulatory requirements, such as the CE quality mark for medical devices or registration with national regulators. This lack of awareness can lead to accidental non-compliance even when developers have good intentions but are not fully informed about the necessary standards [110883].
Duration unknown
The article does not specify whether the software failure was permanent or temporary.
Behaviour crash, omission, value
(a) crash: Examples of poor healthcare apps, such as a quit-smoking app that had not received security updates in over two years, suggest a potential for crashes in which the system loses state and fails to perform its intended functions [110883].
(b) omission: 80% of the healthcare apps reviewed by Orcha did not meet its standards, with failings including poor information, lack of security updates, and insufficient awareness of regulatory requirements, indicating instances where apps omitted to perform their intended functions [110883].
(c) timing: The article does not mention any timing failures [110883].
(d) value: A diabetes management app offering complex medical support without expert back-up and a physiotherapy app offering exercise plans without professional input suggest potential value failures, where the system performs its intended functions incorrectly [110883].
(e) byzantine: The article does not mention any byzantine failures [110883].
(f) other: The article describes no failure behaviours beyond the crashes, omissions, and value failures above [110883].

IoT System Layer

Layer Option Rationale
Perception None None
Communication None None
Application None None

Other Details

Category Option Rationale
Consequence theoretical_consequence, other
(a) death: unknown
(b) harm: unknown
(c) basic: unknown
(d) property: unknown
(e) delay: unknown
(f) non-human: unknown
(g) no_consequence: The article does not mention any specific observed consequences of the software failures discussed.
(h) theoretical_consequence: The article discusses potential consequences of healthcare apps not meeting standards, such as offering complex medical support without expert back-up, providing exercise plans without professional input, and lacking security updates. These issues could lead to harm or negative outcomes for users, but the article gives no specific instances where such consequences occurred.
(i) other: The article also highlights regulatory non-compliance, lack of expert input, and outdated security measures, which could lead to ineffective or even harmful outcomes for users relying on these apps for healthcare support.
Domain health
(a) The failed system relates to the health industry. The article reports that a firm reviewing healthcare apps for NHS trusts found that 80% of the apps did not meet the required standards, with issues including poor information, lack of security updates, and insufficient awareness of regulatory requirements [110883].
