Recurring |
multiple_organization |
(a) The software failure incident related to privacy breaches and human contractors listening to users' recordings has occurred at multiple organizations, not just Apple. The article notes that Amazon and Google also use contractors to check the quality of their voice assistants, and contractors at both companies have expressed discomfort at the nature of overheard recordings [Article 89011]. This indicates that the issue of privacy breaches and human contractors listening to recordings is not unique to Apple but reflects a broader industry practice. |
Phase (Design/Operation) |
design, operation |
(a) The software failure incident related to the design phase can be seen in Apple's Siri grading program. The incident arose from the practice of having human contractors listen to users' Siri recordings in order to "grade" them. This design decision led to contractors regularly hearing confidential and private information, including in-progress drug deals, medical details, and people engaging in sexual acts, all captured through accidental activations of the Siri digital assistant [Article 89011].
(b) The software failure incident related to the operation phase can be observed in the case of the accidental triggers of Siri recordings. The whistleblower mentioned that accidental triggers, especially on the Apple Watch, were incredibly high, leading to snippets of confidential information being recorded without the users' knowledge. This operation-related failure resulted in the system recording sensitive conversations and activities without explicit user consent or awareness [Article 89011]. |
Boundary (Internal/External) |
within_system, outside_system |
(a) within_system: The software failure incident related to Apple's Siri grading program was primarily due to factors originating from within the system. Apple had human contractors listening to Siri recordings for grading purposes, which led to the exposure of confidential and private information, including accidental triggers of sensitive conversations like drug deals, medical details, and intimate moments [Article 89011].
(b) outside_system: The software failure incident also involved external factors related to privacy concerns and regulatory oversight. The incident prompted discussions about the lack of explicit disclosure to users regarding human contractors listening to Siri recordings, raising questions about privacy violations and data protection regulations such as GDPR [Article 89011]. Additionally, the data protection commissioner for Hamburg in Germany took action against Google for a similar program, highlighting the external regulatory scrutiny faced by companies like Apple, Google, and Amazon in the context of privacy and data protection laws [Article 89011]. |
Nature (Human/Non-human) |
non-human_actions, human_actions |
(a) The software failure incident in the articles is related to non-human_actions, specifically accidental triggers of the Siri digital assistant leading to the recording of confidential and private information. Contractors working for Apple mentioned that the Apple Watch was particularly susceptible to accidental triggers, capturing snippets of conversations including medical details, drug deals, and people engaging in sexual acts [Article 89011].
(b) The software failure incident also involves human_actions, as it was revealed that human contractors were listening to a random selection of Siri recordings, including those triggered accidentally, as part of a quality assurance program. This human action of listening to and grading Siri recordings without explicit user consent led to the exposure of confidential information and privacy concerns [Article 89011]. |
Dimension (Hardware/Software) |
hardware, software |
(a) The software failure incident related to hardware:
- The article mentions that contractors working for Apple in Ireland were sent home for the weekend after being told that the system they used for grading "was not working" globally; only managers were asked to stay on site. This global outage of the grading system suggests a hardware- or infrastructure-related issue [Article 89011].
(b) The software failure incident related to software:
- The suspension of Apple's Siri grading programme was due to the revelation that contractors were regularly hearing confidential and private information while carrying out the grading process. This indicates a software failure in the design or implementation of the Siri grading software that allowed for the inadvertent recording and sharing of sensitive information [Article 89011]. |
Objective (Malicious/Non-malicious) |
non-malicious |
(a) The software failure incident reported in the articles is non-malicious. The incident involved Apple suspending its practice of having human contractors listen to users' Siri recordings for grading purposes following a report revealing that contractors were regularly hearing confidential and private information, including in-progress drug deals, medical details, and people engaging in sexual acts accidentally recorded by Siri [Article 89011]. This failure was not due to malicious intent but rather a lack of proper disclosure and control over the handling of sensitive user data. |
Intent (Poor/Accidental Decisions) |
poor_decisions, accidental_decisions |
(a) The intent of the software failure incident related to poor decisions can be seen in Apple's Siri grading program. Apple had human contractors listen to users' Siri recordings to "grade" them for quality assurance purposes. This practice led to contractors regularly hearing confidential and private information, including in-progress drug deals, medical details, and people engaging in sexual acts, all recorded through accidental triggers of the Siri digital assistant [Article 89011].
(b) The intent of the software failure incident related to accidental decisions is evident in the fact that the confidential information was recorded through accidental triggers of the Siri digital assistant. The whistleblower mentioned that accidental triggers, especially on the Apple Watch, were incredibly frequent, leading to snippets of conversations being recorded unintentionally. This unintentional recording of sensitive information resulted from accidental activations of the devices rather than deliberate choices, causing a breach of user privacy [Article 89011]. |
Capability (Incompetence/Accidental) |
development_incompetence, accidental |
(a) The software failure incident related to development incompetence is evident in the article as Apple, Google, and Amazon were criticized for their undisclosed quality assurance programs where human contractors were listening to users' voice assistant recordings. The contractors were reported to have heard confidential and private information, including in-progress drug deals, medical details, and people engaging in sexual acts, which were accidentally triggered by the voice assistants [Article 89011].
(b) The software failure incident related to accidental factors is also highlighted in the article, where accidental triggers of the voice assistants, particularly on the Apple Watch, led to the recording of snippets of private conversations and activities without the users' knowledge. The accidental triggers were so common that contractors reported hearing sensitive information like medical histories, drug deals, and sexual acts, which were not intended to be recorded [Article 89011]. |
Duration |
temporary |
The software failure incident described in the articles can be categorized as a temporary failure. Apple suspended its practice of having human contractors listen to users' Siri recordings after a report revealed the practice [Article 89011]. The suspension was prompted by concerns that contractors were overhearing confidential and private information, and Apple committed to conducting a thorough review and to adding the ability for users to opt out of the quality assurance scheme in a future software update. This indicates that the failure was temporary and was triggered by the specific privacy concerns raised in the report. |
Behaviour |
other |
(a) crash: The incident reported in the articles does not involve a crash where the system loses state and does not perform any of its intended functions [Article 89011].
(b) omission: The incident does not involve a failure due to the system omitting to perform its intended functions at an instance(s) [Article 89011].
(c) timing: The incident does not involve a failure due to the system performing its intended functions correctly, but too late or too early [Article 89011].
(d) value: The incident does not involve a failure due to the system performing its intended functions incorrectly [Article 89011].
(e) byzantine: The incident does not involve the system behaving erroneously with inconsistent responses and interactions [Article 89011].
(f) other: The behavior of the software failure incident reported in the articles is related to a breach of user privacy and trust due to human contractors listening to confidential and private information from Siri recordings without explicit user consent, rather than a technical failure in the software itself [Article 89011]. |