Recurring |
one_organization, multiple_organization |
(a) The software failure incident related to facial recognition technology being fooled by 3D-printed replicas has happened before within the same organization's technology. The article notes that earlier in the year a 3D-printed head was found to trick smartphones' facial recognition into unlocking the phone, with Android models proving the least secure: some devices could be opened simply by showing a photograph of the owner. This points to a recurring security weakness in the facial recognition technology of the same organizations that developed it, as mentioned in Article 93044.
(b) The software failure incident related to facial recognition technology being fooled by a 3D-printed mask has also happened at multiple organizations. The article reports that researchers were able to fool systems operated by different organizations using a 3D-printed mask depicting another person's face: payment systems such as AliPay and WeChat in stores in Asia, a border checkpoint in China, a passport-control gate and self-boarding terminals in Amsterdam, and rail stations in China. This indicates that the security flaw is not limited to a single organization but reflects an industry-wide problem with substandard facial recognition technology, as highlighted by the researchers at Kneron in Article 93044. |
Phase (Design/Operation) |
design, operation |
(a) The software failure incident related to the design phase can be seen in the researchers' discovery of a security flaw in facial recognition technology: the systems, as designed, can be fooled by a 3D-printed mask depicting a different person's face, allowing individuals to pass border checkpoints and passport-control gates as someone else [93044].
(b) The software failure incident related to the operation phase is evident in the same article, where the researchers ran their tests in public locations where facial recognition is in everyday use. They were able to fool payment systems and gain access through self-boarding terminals and rail station gates by exploiting vulnerabilities in the facial recognition technology deployed at these locations [93044]. |
Boundary (Internal/External) |
within_system |
(a) The software failure incident reported in the articles is primarily within_system. The security flaw that allowed the facial recognition technology to be fooled by a 3D-printed mask depicting a different person's face originated within the system itself: the recognition software failed to distinguish a mask from a live face. Researchers demonstrated the flaw at a range of locations, including border checkpoints, passport-control gates, payment systems, and self-boarding terminals, underscoring that the vulnerability lies in the technology rather than in external factors [93044]. |
Nature (Human/Non-human) |
non-human_actions |
(a) The software failure incident occurring due to non-human actions:
Researchers found that facial recognition technology can be fooled by a 3D-printed mask depicting a different person's face; the technology itself failed to reject the mask, which is how the security flaw was discovered [93044]. The technology's limitations were exposed during experiments Kneron conducted while developing its own facial recognition technology. It was also found that a 3D-printed head could trick smartphone facial recognition into unlocking the phone, further highlighting vulnerabilities inherent in the systems [93044].
(b) The software failure incident occurring due to human actions:
The article does not specifically mention any software failure incident occurring due to contributing factors introduced by human actions. |
Dimension (Hardware/Software) |
hardware, software |
(a) The software failure incident related to hardware:
- The article reports on a software failure incident where facial recognition technology was fooled by using a 3D-printed mask depicting a different person's face. This incident occurred at a border checkpoint in China and a passport-control gate in Amsterdam [93044].
- The security flaw in the facial recognition technology was discovered by researchers at the artificial intelligence firm Kneron. They found that a criminal would only need a lifelike mask of a person to bypass security checkpoints, suggesting that the camera hardware used in these deployments cannot tell a lifelike mask apart from a real face [93044].
(b) The software failure incident related to software:
- The article mentions that the facial recognition technology's security flaw was due to the software's inability to accurately distinguish between a real face and a 3D-printed mask. This indicates a software failure in the algorithm or programming of the facial recognition system [93044].
- Kneron CEO Albert Liu emphasized that technology providers should be held accountable if they do not safeguard users to the highest standards, pointing to shortcomings in the software that is supposed to ensure the security and integrity of the facial recognition technology [93044]. |
Objective (Malicious/Non-malicious) |
malicious |
(a) The software failure incident reported in the articles is malicious in nature. Researchers found that facial recognition technology can be fooled by a 3D-printed mask depicting a different person's face, allowing individuals to bypass border checkpoints and passport-control gates [93044]. The failure involves contributing factors that can be introduced by humans with intent to harm the system: criminals could exploit the security flaw, using such a mask to deceive the technology and gain unauthorized access. |
Intent (Poor/Accidental Decisions) |
poor_decisions |
(a) The intent of the software failure incident was poor_decisions. The failure of the facial recognition technology to accurately identify individuals was attributed to poor decisions made by technology providers who did not safeguard users to the highest standards. The CEO of Kneron, Albert Liu, highlighted an industry-wide issue with substandard facial recognition technology, indicating that companies involved were taking shortcuts at the expense of security [93044]. |
Capability (Incompetence/Accidental) |
development_incompetence, accidental |
(a) The software failure incident related to development incompetence is evident in the article: researchers found a security flaw that allowed a 3D-printed mask to defeat facial recognition systems at border checkpoints, payment terminals, and other public locations [93044]. The CEO of the artificial intelligence firm Kneron described an industry-wide problem with substandard facial recognition technology, emphasizing that technology providers should be held accountable for not safeguarding users to the highest standards. The article also notes that firms have not upgraded the technology to fix these issues, indicating a lack of professional competence in securing the facial recognition systems.
(b) The software failure incident related to accidental factors is demonstrated in the same article: the researchers did not set out to cause harm; they uncovered the flaw while testing the limitations of the technology in the course of developing their own facial recognition technology, and the 3D-printed mask bypassed the security checkpoints as an unintended finding of those tests [93044]. The article also raises concerns that hackers or police could gain access to personal information stored on smartphones through vulnerabilities in facial recognition systems, highlighting unintended security loopholes that could compromise user privacy. |
Duration |
temporary |
The software failure incident described in the articles can be categorized as a temporary failure. The articles describe how researchers were able to fool facial recognition systems with 3D-printed masks depicting different faces, exposing a flaw in the technology. The failure manifested only under specific circumstances, namely the presentation of a lifelike mask, rather than occurring under all operating conditions, and the vulnerability could be addressed by upgrading the technology to enhance its security measures [93044]. |
Behaviour |
omission, value, other |
(a) crash: The articles do not mention any specific instances of a system crash where the software completely loses state and fails to perform any of its intended functions.
(b) omission: The software failure incident related to facial recognition technology being fooled by a 3D-printed mask can be categorized as an omission. At the instances when the mask was presented, the system omitted its intended function of rejecting anyone other than the enrolled individual, thereby allowing unauthorized access [93044].
(c) timing: There is no indication in the articles that the software failure incident was related to timing issues where the system performed its intended functions but at incorrect times.
(d) value: The incident can also be associated with a value failure, since the facial recognition technology performed its identification function but returned an incorrect result, accepting the 3D-printed mask as the legitimate user and thereby enabling a security breach [93044].
(e) byzantine: The articles do not describe the software failure incident as exhibiting byzantine behavior with inconsistent responses or interactions.
(f) other: The behavior of the software failure incident can be described as a security vulnerability or flaw in the facial recognition technology, allowing it to be easily tricked by a 3D-printed mask, which is not explicitly covered by the options provided. |