Incident: Apple Suspends Siri Grading Program Due to Privacy Concerns

Published Date: 2019-08-02

Postmortem Analysis
Timeline 1. In August 2019, following a press report revealing the practice, Apple suspended its programme of having human contractors listen to and "grade" users' Siri recordings [89011].
System 1. Siri grading system [89011]
Responsible Organization 1. Apple [89011]
Impacted Organization 1. Apple [89011]
Software Causes 1. Accidental activations of Siri (reportedly frequent on the Apple Watch) recorded confidential audio without users' knowledge; these recordings were then passed to human contractors for "grading", exposing private information [89011].
Non-software Causes 1. Lack of transparency and disclosure by Apple: users were not explicitly told that human contractors would listen to Siri recordings, and no opt-out was offered [89011].
Impacts 1. Apple suspended its Siri grading programme worldwide following the report, halting its quality assurance process [89011]. 2. Users unknowingly had confidential and private information (including in-progress drug deals, medical details, and intimate conversations) heard by contractors, raising privacy concerns [89011]. 3. Contractors working for Apple in Ireland were sent home for the weekend with no clarity about their future employment [89011]. 4. The incident exposed Apple's failure to disclose the human review of Siri recordings, raising questions about user consent and privacy [89011]. 5. It also drew attention to similar grading practices at Google and Amazon, broadening concerns about the privacy implications of voice assistant technologies [89011].
Preventions 1. Transparent disclosure: explicitly informing users that human contractors would listen to Siri recordings for quality assurance, so they could make an informed choice about participating, could have prevented the incident [89011]. 2. A privacy review and risk assessment of the grading programme before launch could have identified the risk of inadvertently exposing confidential information [89011].
Fixes 1. Conducting a thorough review of the practice of having human contractors grade Siri recordings, and ensuring user privacy is protected, before restarting the programme [89011]. 2. Adding the ability for users to opt out of the quality assurance scheme altogether in a future software update [89011]. 3. Ensuring compliance with data protection regulations such as the EU's General Data Protection Regulation (GDPR) [89011].
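The opt-out fix above can be illustrated with a minimal, hypothetical sketch (not Apple's actual implementation; all names and the confidence threshold are assumptions for illustration): a grading pipeline that forwards audio samples for human review only when the user has explicitly consented, and that discards low-confidence activations likely to be accidental triggers.

```python
from dataclasses import dataclass

@dataclass
class VoiceSample:
    audio_id: str
    trigger_confidence: float  # 0.0-1.0: how sure the wake-word detector was
    user_opted_in: bool        # explicit consent to human grading

def select_for_grading(samples, min_confidence=0.8):
    """Return only samples eligible for human review.

    A sample is eligible when the user explicitly opted in AND the
    activation was high-confidence, filtering out accidental triggers
    (which the whistleblower said were especially common on the
    Apple Watch).
    """
    return [
        s for s in samples
        if s.user_opted_in and s.trigger_confidence >= min_confidence
    ]

samples = [
    VoiceSample("a1", 0.95, True),   # consented, deliberate trigger -> kept
    VoiceSample("a2", 0.30, True),   # consented, likely accidental  -> dropped
    VoiceSample("a3", 0.99, False),  # no consent                    -> dropped
]
eligible = select_for_grading(samples)
```

With this gating in place, only sample "a1" reaches human graders; the no-consent and likely-accidental recordings never leave the filter.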
References 1. A whistleblower among Apple's Siri grading contractors [89011]

Software Taxonomy of Faults

Category Option Rationale
Recurring multiple_organization (a) The failure pattern of human contractors listening to users' voice-assistant recordings, with the attendant privacy breaches, has occurred at multiple organizations, not just Apple. The article notes that Amazon and Google also use contractors to check the quality of their voice assistants, and that contractors at both companies have expressed discomfort at the nature of overheard recordings [89011]. The issue is therefore an industry-wide practice rather than one unique to Apple.
Phase (Design/Operation) design, operation (a) Design: Apple's Siri grading programme was designed so that human contractors listened to users' recordings to "grade" them. This design meant contractors regularly heard confidential and private information, including in-progress drug deals, medical details, and people engaging in sexual acts, captured through accidental activations of the Siri digital assistant [89011]. (b) Operation: the whistleblower described accidental triggers, especially on the Apple Watch, as "incredibly high", so in operation the system recorded snippets of sensitive conversations without users' knowledge or consent [89011].
Boundary (Internal/External) within_system, outside_system (a) within_system: the failure originated largely inside the system. Apple's own grading programme routed Siri recordings, including accidentally triggered ones containing sensitive conversations (drug deals, medical details, intimate moments), to human contractors [89011]. (b) outside_system: external factors included privacy and regulatory pressure. The incident raised questions about undisclosed human review under data protection law such as GDPR, and the data protection commissioner for Hamburg, Germany, had already taken action against Google over a similar programme, illustrating the regulatory scrutiny facing Apple, Google, and Amazon [89011].
Nature (Human/Non-human) non-human_actions, human_actions (a) non-human_actions: accidental activations of the Siri digital assistant recorded confidential and private information. Contractors said the Apple Watch was particularly susceptible to accidental triggers, capturing snippets of conversations including medical details, drug deals, and people engaging in sexual acts [89011]. (b) human_actions: human contractors listened to a random selection of Siri recordings, including accidentally triggered ones, as part of a quality assurance programme. Doing so without explicit user consent exposed confidential information and caused the privacy breach [89011].
Dimension (Hardware/Software) hardware, software (a) Hardware: the main hardware factor was the Apple Watch, which contractors said was particularly prone to accidental activations that captured private audio [89011]. (Contractors in Ireland were later told the grading system "was not working" globally, but in context that appears to describe the suspension of the programme rather than a hardware fault [89011].) (b) Software: Siri's wake-word detection activated accidentally, and the grading pipeline then forwarded the resulting recordings, including sensitive audio, to human reviewers, reflecting a failure in the design of the grading software and its privacy safeguards [89011].
Objective (Malicious/Non-malicious) non-malicious (a) The incident was non-malicious. Apple suspended the grading practice after a report revealed that contractors regularly heard confidential and private information, including in-progress drug deals, medical details, and people engaging in sexual acts accidentally recorded by Siri [89011]. The failure stemmed from inadequate disclosure and control over sensitive user data, not from malicious intent.
Intent (Poor/Accidental Decisions) poor_decisions, accidental_decisions (a) poor_decisions: Apple's decision to have human contractors grade Siri recordings, without explicit user disclosure or an opt-out, led to contractors regularly hearing confidential information captured by accidental triggers [89011]. (b) accidental_decisions: the sensitive recordings themselves resulted from accidental activations of the Siri digital assistant. The whistleblower noted that accidental triggers on devices such as the Apple Watch were "incredibly high", so private conversations were recorded unintentionally, breaching user privacy [89011].
Capability (Incompetence/Accidental) development_incompetence, accidental (a) development_incompetence: Apple, Google, and Amazon were criticized for undisclosed quality assurance programmes in which human contractors listened to users' voice-assistant recordings, exposing confidential information the systems had captured [89011]. (b) accidental: accidental triggers, particularly on the Apple Watch, recorded snippets of private conversations and activities (medical histories, drug deals, sexual acts) that were never intended to be recorded [89011].
Duration temporary The incident is best categorized as a temporary failure. Apple suspended the grading practice after the report, pending a thorough review, and committed to letting users opt out of the quality assurance scheme in a future software update [89011]. The failure was triggered by specific circumstances, namely the privacy concerns raised in the report, rather than being permanent.
Behaviour other (a) crash: not applicable; the system did not lose state or stop performing its intended functions [89011]. (b) omission: not applicable; the system did not omit its intended functions [89011]. (c) timing: not applicable; there was no failure of performing functions too late or too early [89011]. (d) value: not applicable; the system did not perform its functions incorrectly [89011]. (e) byzantine: not applicable; there were no inconsistent responses or interactions [89011]. (f) other: the failure was a breach of user privacy and trust: human contractors listened to confidential and private Siri recordings without explicit user consent, rather than the software malfunctioning in a technical sense [89011].

IoT System Layer

Layer Option Rationale
Perception None None
Communication None None
Application None None

Other Details

Category Option Rationale
Consequence harm, property, other (a) death: no deaths are reported due to the incident [89011]. (b) harm: contractors overheard confidential and private information, including in-progress drug deals, medical details, and people engaging in sexual acts; this invasion of privacy constitutes harm [89011]. (c) basic: no impact on people's access to food or shelter is reported [89011]. (d) property: the exposure of confidential recordings impacted users' data and privacy [89011]. (e) delay: no delays are reported [89011]. (f) non-human: the incident primarily affected human users; no non-human entities are specifically mentioned as impacted [89011]. (g) no_consequence: not applicable; the incident had real observed consequences, including privacy violations and the suspension of the grading programme [89011]. (h) theoretical_consequence: the potential privacy and data protection consequences discussed in the article were not merely theoretical, as they led to the programme's suspension and broader concern about privacy practices [89011]. (i) other: the incident also highlighted the lack of transparency around the use of human contractors to review Siri recordings and the potential misuse of personal data collected through voice assistants [89011].
Domain information, health (a) information: the incident concerns Apple's Siri grading programme, in which human contractors listened to users' recordings to assess and improve the service, raising privacy concerns about the inadvertent collection of sensitive information [89011]. (j) health: the incident indirectly touches the health industry, as contractors overheard confidential medical details during the grading process [89011]. (b)-(i), (k), (l): there is no direct mention of the transportation, natural resources, sales, construction, manufacturing, utilities, finance, knowledge, entertainment, or government industries in the article. (m) the incident is not directly related to any other listed industry.
