Incident: Facial Recognition Technology Vulnerability Exposed by 3D-Printed Mask

Published Date: 2019-12-16

Postmortem Analysis
Timeline
The software failure incident, in which facial recognition was fooled by a 3D-printed mask depicting a different person's face, occurred before December 16, 2019, the date on which the article discussing the incident was published [93044].
System
The software failure incident reported in Article 93044 involved a failure of facial recognition technology. Specifically, the following systems/components failed:
1. Facial recognition technology used in payment systems at a border checkpoint in China and at a passport-control gate in Amsterdam [93044].
2. Facial recognition technology used in self-boarding terminals in Amsterdam [93044].
3. Facial recognition technology used at rail stations in China for fare payment and boarding trains [93044].
4. Facial recognition technology on Android models, including the Galaxy Note 8, Galaxy S9, LG G7 ThinQ, and OnePlus 6, which were found to be less secure than Apple's iPhone X models [93044].
Responsible Organization
1. Researchers at the artificial intelligence firm Kneron discovered the security flaw in facial recognition technology [93044].
2. Companies involved in developing and implementing substandard facial recognition technology were also responsible for the incident [93044].
Impacted Organization
1. Border checkpoint in China
2. Passport-control gate in Amsterdam
3. Payment systems such as AliPay and WeChat
4. Self-boarding terminal in Amsterdam
5. Rail stations in China
6. Smartphone facial recognition technology, particularly Android models
7. Personal information stored on handsets
8. Users' privacy and security
9. Firms involved in providing substandard facial recognition technology
10. Facial recognition technology providers across the industry, given the industry-wide nature of the issue [93044]
Software Causes
1. Substandard facial recognition technology that could be fooled by a 3D-printed mask depicting a different person's face [93044].
2. Lack of security measures in facial recognition systems used at border checkpoints, passport-control gates, payment systems, self-boarding terminals, and rail stations [93044].
3. Failure of some Android smartphone models to provide facial recognition as secure as that of Apple's iPhone X models [93044].
Non-software Causes
1. The use of a 3D-printed mask depicting a different person's face to trick the facial recognition technology [93044].
2. Lack of proper safeguards and standards by technology providers in ensuring the security of users [93044].
3. Industry-wide issue with substandard facial recognition technology [93044].
4. Security flaw in the facial recognition technology discovered by researchers [93044].
5. Concerns over the vulnerability of personal information stored on smartphones due to facial recognition technology [93044].
Impacts
1. The software failure incident involving facial recognition technology being fooled by a 3D-printed mask had significant impacts on security at border checkpoints in China and Amsterdam [93044].
2. The incident raised concerns about the industry-wide issue of substandard facial recognition technology, highlighting the need for technology providers to be held accountable for safeguarding users to the highest standards [93044].
3. The failure of facial recognition technology to accurately identify individuals led to potential privacy threats for users, especially in public locations where facial recognition is commonly used [93044].
4. The incident demonstrated the limitations of current facial recognition technology, prompting the need for upgrades and improvements to enhance security measures [93044].
Preventions
1. Implementing multi-factor authentication alongside facial recognition could have prevented the incident by adding an extra layer of security [93044] (see the sketch below).
2. Regularly updating and upgrading the facial recognition technology to address known vulnerabilities and security flaws could have prevented the incident [93044].
3. Conducting thorough testing and validation of the facial recognition technology in varied real-world scenarios, to identify and mitigate weaknesses before deployment, could have prevented the incident [93044].
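The article does not describe how such a multi-factor check would be implemented. The following minimal Python sketch is purely illustrative: it assumes a face-matching score and a one-time-passcode verifier are available, and every name in it (face_similarity, totp_valid, authorize, the thresholds) is a hypothetical placeholder rather than part of any system cited in Article 93044.

    import hashlib
    import hmac
    import time

    def face_similarity(live_capture: bytes, enrolled_template: bytes) -> float:
        """Placeholder for a vendor face-matching model; returns a score in [0, 1].
        The byte comparison below exists only so the sketch runs; it is NOT a
        real biometric comparison."""
        n = min(len(live_capture), len(enrolled_template))
        if n == 0:
            return 0.0
        return sum(a == b for a, b in zip(live_capture, enrolled_template)) / n

    def totp_valid(secret: bytes, submitted_code: str, interval: int = 30) -> bool:
        """Small TOTP-style check (illustrative only, not a complete RFC 6238 implementation)."""
        counter = int(time.time() // interval)
        mac = hmac.new(secret, counter.to_bytes(8, "big"), hashlib.sha1).hexdigest()
        expected = f"{int(mac[-6:], 16) % 1_000_000:06d}"
        return hmac.compare_digest(expected, submitted_code)

    def authorize(live_capture: bytes, enrolled_template: bytes,
                  secret: bytes, submitted_code: str,
                  face_threshold: float = 0.95) -> bool:
        """Grant access only if BOTH the face match and the second factor succeed."""
        # A high-confidence face match alone is never sufficient: a lifelike mask
        # could produce one, so the one-time code is always required as well.
        if face_similarity(live_capture, enrolled_template) < face_threshold:
            return False
        return totp_valid(secret, submitted_code)

The point of the sketch is the control flow in authorize: even a mask that produces a high face-match score cannot open the gate without the second factor.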
Fixes
1. Implementing more advanced facial recognition technology that can detect and differentiate between real faces and 3D-printed masks [93044] (see the sketch below).
2. Regularly updating and upgrading facial recognition systems to address security vulnerabilities and flaws [93044].
3. Holding technology providers accountable for safeguarding users to the highest standards and ensuring the security of facial recognition technology [93044].
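The article likewise does not say how real faces would be distinguished from 3D-printed masks. One commonly described approach, used here only as an assumption for illustration, is to pair the 2D face match with a depth-based liveness check, since a photograph or a thin printed mask presents a much flatter surface than a live face; a full-relief 3D mask would still require stronger countermeasures (infrared or skin-texture analysis, challenge-response prompts), which are beyond this sketch.

    from dataclasses import dataclass
    from statistics import pstdev
    from typing import Sequence

    @dataclass
    class FaceCapture:
        """Illustrative capture: a 2D match score plus depth samples (in mm)
        taken across the detected face region."""
        match_score: float
        depth_samples_mm: Sequence[float]

    def looks_live(capture: FaceCapture, min_depth_variation_mm: float = 8.0) -> bool:
        """Reject captures whose face region is too flat to be a live face.
        A real face has noticeable depth relief (nose vs. cheeks vs. eye sockets);
        a photograph or thin mask shows much less variation. The 8 mm default is
        an arbitrary illustrative threshold, not a tuned value."""
        if len(capture.depth_samples_mm) < 2:
            return False
        return pstdev(capture.depth_samples_mm) >= min_depth_variation_mm

    def accept(capture: FaceCapture, match_threshold: float = 0.95) -> bool:
        # Both conditions must hold: the face must match AND appear live.
        return capture.match_score >= match_threshold and looks_live(capture)

    # Example: a flat spoof with a high match score is rejected by the liveness check.
    spoof = FaceCapture(match_score=0.97, depth_samples_mm=[412.0, 413.1, 412.4, 412.8])
    real = FaceCapture(match_score=0.97, depth_samples_mm=[402.0, 418.5, 395.2, 410.7])
    assert not accept(spoof)
    assert accept(real)

The thresholds and the depth-variation heuristic are arbitrary values for the sketch; production anti-spoofing depends on dedicated depth or infrared hardware and trained liveness models.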
References
1. Researchers at the artificial intelligence firm Kneron [93044]
2. Kneron CEO Albert Liu [93044]
3. Forbes reporter Thomas Brewster [93044]

Software Taxonomy of Faults

Category Option Rationale
Recurring one_organization, multiple_organization (a) The software failure incident related to facial recognition being fooled by a 3D-printed replica has happened again within the same organization. The article notes that earlier in the year, a 3D-printed head was found to trick smartphones' facial recognition technology into unlocking the phone, with Android models the least secure; some devices unlocked when simply shown a photograph of the owner. This points to a recurring security problem with the facial recognition technology developed by the same vendors [93044]. (b) The incident has also occurred across multiple organizations. Researchers found that the technology could be fooled with a 3D-printed mask depicting a different person's face, and the mask tricked payment and access systems at a range of locations: a border checkpoint in China, a passport-control gate in Amsterdam, stores in Asia using payment systems such as AliPay and WeChat, self-boarding terminals in Amsterdam, and rail stations in China. The security flaw is therefore not limited to a single organization but is an industry-wide issue with substandard facial recognition technology, as highlighted by the researchers at Kneron [93044].
Phase (Design/Operation) design, operation (a) The software failure incident related to the design phase can be seen in the article where researchers found a security flaw in facial recognition technology. They discovered that the technology can be fooled by using a 3D-printed mask depicting a different person's face, allowing individuals to bypass security checkpoints at border checkpoints and passport-control gates [93044]. (b) The software failure incident related to the operation phase is evident in the same article where researchers conducted tests in public locations where facial recognition is used. They were able to fool payment systems and gain access to self-boarding terminals and rail stations by exploiting the vulnerabilities in the facial recognition technology being operated in these locations [93044].
Boundary (Internal/External) within_system (a) The software failure incident reported in the articles is primarily within_system. The facial recognition technology's security flaw, which allowed it to be fooled by a 3D-printed mask depicting a different person's face, originated from within the system itself. Researchers found that the technology could be tricked at various locations, including border checkpoints, passport-control gates, payment systems, and self-boarding terminals, highlighting the vulnerability of the facial recognition technology [93044].
Nature (Human/Non-human) non-human_actions (a) The software failure incident occurring due to non-human actions: Researchers found that facial recognition technology can be fooled by using a 3D-printed mask depicting a different person's face, which led to the security flaw being discovered [93044]. The technology's limitations were exposed during experiments conducted by Kneron to develop its own facial recognition technology. Additionally, it was found that a 3D-printed head could trick smartphone facial recognition technology into unlocking the phone, highlighting vulnerabilities in the system [93044]. (b) The software failure incident occurring due to human actions: The article does not specifically mention any software failure incident occurring due to contributing factors introduced by human actions.
Dimension (Hardware/Software) hardware, software (a) The software failure incident related to hardware: - The article reports that facial recognition technology was fooled with a 3D-printed mask depicting a different person's face at a border checkpoint in China and a passport-control gate in Amsterdam [93044]. - The security flaw was discovered by researchers at the artificial intelligence firm Kneron, who found that criminals need only a lifelike mask of a person to bypass security checkpoints, pointing to the limits of the camera and sensing hardware used by these facial recognition systems [93044]. (b) The software failure incident related to software: - The security flaw arose from the software's inability to accurately distinguish between a real face and a 3D-printed mask, indicating a failure in the algorithm or programming of the facial recognition system [93044]. - Kneron CEO Albert Liu emphasized that technology providers should be held accountable if they do not safeguard users to the highest standards, pointing to a failure to ensure the security and integrity of the facial recognition software [93044].
Objective (Malicious/Non-malicious) malicious (a) The software failure incident reported in the articles is classified as malicious. Researchers demonstrated that facial recognition technology can be fooled with a 3D-printed mask depicting a different person's face, allowing individuals to bypass security checkpoints at border crossings and passport-control gates [93044]. Although the researchers themselves were not acting with harmful intent, the flaw is precisely the kind that criminals could exploit to deceive the technology and gain unauthorized access, which is why the failure is categorized as malicious.
Intent (Poor/Accidental Decisions) poor_decisions (a) The intent of the software failure incident was poor_decisions. The failure of the facial recognition technology to accurately identify individuals was attributed to poor decisions made by technology providers who did not safeguard users to the highest standards. The CEO of Kneron, Albert Liu, highlighted an industry-wide issue with substandard facial recognition technology, indicating that companies involved were taking shortcuts at the expense of security [93044].
Capability (Incompetence/Accidental) development_incompetence, accidental (a) The software failure incident related to development incompetence is evident in the article: researchers found a security flaw in facial recognition technology that allowed a 3D-printed mask to trick payment systems at border checkpoints and other public locations [93044]. The CEO of the artificial intelligence firm Kneron highlighted an industry-wide issue with substandard facial recognition technology, emphasizing that technology providers should be held accountable for not safeguarding users to the highest standards. The article also notes that firms have not upgraded the technology to fix these issues, indicating a lack of professional competence in securing the facial recognition systems. (b) The software failure incident related to accidental factors is shown by the way the flaw surfaced: the researchers did not set out to break the systems but uncovered the weakness while testing the limits of the technology as they developed their own facial recognition technology [93044]. The article also raises concerns that hackers or police could gain access to personal information stored on smartphones through such vulnerabilities, highlighting unintended security loopholes that could compromise user privacy.
Duration temporary The software failure incident described in the articles can be categorized as a temporary failure. The articles discuss how researchers were able to fool facial recognition systems using 3D-printed masks depicting different faces, highlighting a flaw in the technology. This incident was not a permanent failure as it was caused by specific circumstances, such as the use of lifelike masks, rather than being a fundamental flaw in the technology itself. The incident demonstrated a vulnerability that could be addressed and fixed by upgrading the technology to enhance security measures [93044].
Behaviour omission, value, other (a) crash: The articles do not mention any specific instances of a system crash where the software completely loses state and fails to perform any of its intended functions. (b) omission: The software failure incident related to facial recognition technology being fooled by a 3D-printed mask can be categorized under omission. The system omitted to perform its intended function of accurately recognizing the correct individual's face, allowing unauthorized access [93044]. (c) timing: There is no indication in the articles that the software failure incident was related to timing issues where the system performed its intended functions but at incorrect times. (d) value: The software failure incident can be associated with a value failure as the facial recognition technology incorrectly identified the 3D-printed mask as the legitimate user, leading to a security breach [93044]. (e) byzantine: The articles do not describe the software failure incident as exhibiting byzantine behavior with inconsistent responses or interactions. (f) other: The behavior of the software failure incident can be described as a security vulnerability or flaw in the facial recognition technology, allowing it to be easily tricked by a 3D-printed mask, which is not explicitly covered by the options provided.

IoT System Layer

Layer Option Rationale
Perception sensor, embedded_software (a) sensor: The software failure incident reported in the articles relates to the sensor layer of the cyber physical system. The facial recognition technology was fooled by a 3D-printed mask depicting a different person's face, which tricked payment systems at a border checkpoint in China and a passport-control gate in Amsterdam, indicating a failure of the sensor's ability to capture enough information to distinguish real faces from masks [93044]. (e) embedded_software: The incident also involves the embedded software layer of the cyber physical system. The article notes that the facial recognition technology had security flaws that allowed criminals to bypass security checkpoints using lifelike masks of a person, suggesting a vulnerability or error in the embedded software of the facial recognition systems [93044].
Communication unknown The software failure incident reported in the articles does not specifically mention a failure related to the communication layer of the cyber physical system. The focus of the incident is on the security flaw in facial recognition technology that can be exploited using 3D-printed masks to bypass security checkpoints and unlock devices. Therefore, it is unknown whether the failure was related to the link_level or connectivity_level of the cyber physical system.
Application TRUE The software failure incident described in the articles is related to the application layer of the cyber physical system. The failure was due to a security flaw in the facial recognition technology, which allowed the system to be fooled by a 3D-printed mask depicting a different person's face. This flaw was exploited by researchers to bypass security checkpoints at various locations, including a border checkpoint in China and a passport-control gate in Amsterdam [93044].

Other Details

Category Option Rationale
Consequence no_consequence, theoretical_consequence The software failure incident involving facial recognition being fooled by a 3D-printed mask did not result in any observed real-world consequences; the article describes only the researchers' demonstration (no_consequence). It did, however, highlight theoretical consequences: the potential threat to users' privacy and security if criminals were to exploit the same flaw (theoretical_consequence) [93044].
Domain information, transportation The software failure incident reported in the articles relates to the information (a) and transportation (b) industries. (a) Information industry: the facial recognition technology that failed was used in public locations where facial recognition supports services such as payment through systems like AliPay and WeChat [93044]. (b) Transportation industry: the technology was used at a border checkpoint in China, at a passport-control gate and a self-boarding terminal in Amsterdam, and at rail stations in China for commuter access; researchers were able to bypass these checkpoints and gain access using the flawed facial recognition technology [93044].
