Incident: Facial Recognition System Spoofing Using 3-D Rendering Technology

Published Date: 2016-08-19

Postmortem Analysis
Timeline 1. The software failure incident, in which facial recognition systems were spoofed using 3-D rendering and Internet stalking, took place in August 2016, as reported in the article published on August 19, 2016 [46971].
System 1. Facial recognition systems [46971]
Responsible Organization 1. The software failure incident was triggered by researchers from the University of North Carolina, who demonstrated a method of stealing faces using 3-D rendering and virtual reality technology [46971].
Impacted Organization 1. Individuals who use face authentication systems on consumer products like laptops and smartphones [46971].
Software Causes 1. The software cause of the failure incident was the vulnerability of facial recognition systems to spoofing attacks using 3-D rendering based on publicly available photos [46971].
Non-software Causes
1. Lack of awareness about online presence and privacy: the incident relied on personal photos publicly available on social media platforms such as Facebook, LinkedIn, and Google+, which were used to create 3-D facial models for the attack [46971].
2. Vulnerability of biometric data: the incident highlighted the risk of using biometric data for authentication, since once it is compromised or publicly available, it can be exploited [46971].
3. Insufficient quality of available photos: the image resources found online were often low resolution and did not depict full faces, which posed a challenge in creating realistic 3-D face replicas for the attack [46971] (a minimal sketch of the overall attack flow follows this list).
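To make the attack flow concrete, here is a minimal, hypothetical Python sketch of the pipeline described above: gathering public photos, reconstructing a textured 3-D face model, and presenting an animated render to the authentication camera. All function names and the example URL are illustrative stand-ins, not the researchers' actual code.

```python
from dataclasses import dataclass

@dataclass
class FaceModel:
    source_photos: list          # URLs of publicly available images
    textured_mesh: object = None # placeholder for the 3-D reconstruction

def collect_public_photos(name: str) -> list:
    """Stand-in for harvesting images via search engines and social
    networks; real-world results are often low resolution and rarely
    show the full face."""
    return [f"https://example.invalid/photos/{name}/{i}.jpg" for i in range(3)]

def reconstruct_3d_face(photos: list) -> FaceModel:
    """Stand-in for the 3-D rendering step: fit a facial mesh to the
    photos and synthesize texture for regions no photo covers."""
    return FaceModel(source_photos=photos)

def present_in_vr(model: FaceModel) -> None:
    """Stand-in for displaying the animated render on a mobile VR
    display held up to the authentication camera."""
    print(f"Presenting render built from {len(model.source_photos)} public photos")

present_in_vr(reconstruct_3d_face(collect_public_photos("volunteer")))
```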
Impacts
1. The incident exposed the vulnerability of biometric data, particularly facial biometric data, to theft and exploitation [46971].
2. It highlighted the risk of compromising biometric data, which remains constant and can be easily recorded and exploited if publicly available [46971].
3. It demonstrated that facial recognition systems can be spoofed with 3-D renders based on publicly available photos, raising concerns about the security of face authentication in consumer products like laptops and smartphones [46971].
4. It emphasized the need for consumer face authentication systems to evolve continuously against new spoofing methods, potentially by incorporating hardware and sensors beyond mobile cameras or webcams [46971].
Preventions
1. Scanning faces for human infrared signals, which would not be reproduced in a VR system, as suggested by Anil Jain, a biometrics researcher at Michigan State University [46971].
2. Evolving consumer face authentication systems to keep up with new spoofing methods, potentially by incorporating hardware and sensors beyond mobile cameras or webcams, as mentioned by UNC's Price [46971].
Fixes
1. Enhancing face authentication systems to scan faces for human infrared signals, which would not be reproduced in a VR system [46971].
2. Evolving consumer face authentication systems to keep up with new spoofing methods, potentially by incorporating hardware and sensors beyond mobile cameras or webcams [46971].
A sketch of how such a liveness check could gate authentication follows this list.
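As a hedged illustration of fix 1, the sketch below shows how an infrared liveness score could gate a face match before access is granted. The thresholds, field names, and 0-1 scoring scale are assumptions for illustration; a real system would tune these against false accept and reject rates.

```python
from dataclasses import dataclass

@dataclass
class CaptureResult:
    face_match_score: float   # similarity between live capture and enrolled template (0-1)
    ir_liveness_score: float  # strength of the detected human infrared signature (0-1)

# Illustrative thresholds, not values from any real product.
FACE_MATCH_THRESHOLD = 0.90
IR_LIVENESS_THRESHOLD = 0.75

def authenticate(capture: CaptureResult) -> bool:
    """Grant access only if the face matches AND the capture carries a live
    human infrared signature. A VR render held up to the camera might pass
    the first check, but it would not emit the infrared signal of a real face."""
    if capture.face_match_score < FACE_MATCH_THRESHOLD:
        return False  # face does not match the enrolled user
    if capture.ir_liveness_score < IR_LIVENESS_THRESHOLD:
        return False  # likely a screen, print, or render rather than a live face
    return True

# A spoof that matches the face but fails the liveness check is rejected.
print(authenticate(CaptureResult(face_match_score=0.95, ir_liveness_score=0.10)))  # False
print(authenticate(CaptureResult(face_match_score=0.95, ir_liveness_score=0.92)))  # True
```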
References
1. Publicly available photos
2. Image search engines
3. Professional photos
4. Social networks like Facebook, LinkedIn, and Google+
5. Indoor head shots of participants
6. Consumer software vendors like the Google Play Store and the iTunes Store
7. Department of Computer Science/UNC Chapel Hill
8. Google announcements and warnings
9. Biometrics researcher at Michigan State University

Software Taxonomy of Faults

Category Option Rationale
Recurring multiple_organization (a) The vulnerability demonstrated in this incident spans systems from multiple organizations. The researchers from the University of North Carolina created 3-D facial models based on publicly available photos and used them to spoof four of the five authentication systems they tested, including products distributed through the Google Play Store and the iTunes Store [46971]. This shows that the weakness is not confined to a single vendor and underscores the need for improved anti-spoofing measures across the industry.
Phase (Design/Operation) design, operation
(a) The design-phase aspect appears in how the attack was constructed: researchers from the University of North Carolina presented a system that used digital 3-D facial models based on publicly available photos to defeat facial recognition. They collected images of volunteers through image search engines, professional photos, and publicly available assets on social networks like Facebook, LinkedIn, and Google+ to create 3-D renders of faces that successfully spoofed four out of five systems they tested [46971].
(b) The operation-phase aspect is evident in the testing of the virtual reality face renders against five authentication systems, including those available from consumer software vendors like the Google Play Store and the iTunes Store. The researchers tricked all five systems in every case when using control photos, and tricked four of the systems with public web photos at success rates ranging from 55 percent to 85 percent (see the success-rate sketch after this table) [46971].
Boundary (Internal/External) within_system (a) within_system: The software failure incident discussed in the articles is related to facial recognition systems being tricked by 3-D rendering and virtual reality technology based on publicly available photos [46971]. The failure originates from within the system as the facial recognition systems were unable to distinguish between real faces and 3-D renders created from online photos, highlighting a vulnerability in the authentication process. The attack was successful in spoofing four out of the five systems tested, indicating a failure within the system's ability to accurately authenticate individuals based on facial features.
Nature (Human/Non-human) non-human_actions (a) The incident is primarily related to non-human actions. The failure occurred because facial recognition systems could be tricked by 3-D renders based on publicly available photos and displayed with mobile virtual reality technology. The researchers successfully spoofed four of the five systems they tested using this method, highlighting the risks of authenticating identity with biometrics [46971]. The incident shows how software systems can fail due to factors introduced without direct human participation, such as the exploitation of publicly available data and the limitations of current authentication technologies.
Dimension (Hardware/Software) hardware
(a) Hardware: the articles discuss the need for consumer face authentication systems to incorporate hardware and sensors beyond mobile cameras or webcams. Some vendors, like Microsoft with its Windows Hello software, already ship commercial solutions that leverage alternative hardware, but implementing such hardware is challenging on mobile devices where space is limited. Adding specialized components like IR cameras or structured light projectors may be necessary to strengthen the security mechanism, though it involves a cost-benefit trade-off for hardware vendors [46971].
(b) Software: the incident did not originate in a defect in the authentication software's code. The software performed as designed but, relying only on standard cameras, could not distinguish 3-D face replicas created from publicly available photos from real faces. The attack tricked multiple authentication systems using digital 3-D facial models based on online images, highlighting the risks that arise when biometric data is compromised or publicly available [46971].
Objective (Malicious/Non-malicious) malicious (a) The software failure incident described in the articles is malicious in nature. Researchers from the University of North Carolina presented a system at the Usenix security conference that used digital 3-D facial models based on publicly available photos to defeat facial recognition systems. The attack successfully spoofed four out of the five systems they tried, demonstrating a method of stealing a face through 3-D rendering and Internet stalking [46971]. The incident involved intentionally tricking facial recognition systems, highlighting the malicious intent behind the failure.
Intent (Poor/Accidental Decisions) unknown (a) The incident was not the result of poor decisions; it stemmed from intentional research conducted by security and computer vision specialists from the University of North Carolina. The researchers demonstrated a method of stealing a face using 3-D rendering and publicly available photos to defeat facial recognition systems, collecting images of volunteers through image search engines, professional photos, and social networks to create virtual reality face renders that successfully spoofed authentication systems [46971]. The work was a deliberate attempt to expose the vulnerabilities of facial recognition systems rather than a consequence of poor decisions.
Capability (Incompetence/Accidental) accidental
(a) Elements of development incompetence contributed: the affected facial recognition systems lacked robust anti-spoofing measures, so digital 3-D facial models based on publicly available photos could defeat them [46971]. The researchers collected images of volunteers through image search engines, professional photos, and publicly available assets on social networks like Facebook, LinkedIn, and Google+ to create virtual reality face renders that successfully spoofed authentication systems [46971], underscoring the downside of authenticating identity with biometrics when biometric data is compromised or publicly available.
(b) The failure can also be considered accidental in the sense that the vulnerability was an unintended weakness of the systems rather than something deliberately introduced. The researchers found that even individuals who make an active effort to protect their privacy online had their faces replicated and used to trick authentication systems, indicating how inadvertently exposed such systems are to attacks leveraging publicly available data [46971]. This underscores the need for continuous improvement in security measures to prevent unauthorized access and misuse of personal data through facial recognition technology.
Duration temporary The software failure incident aligns with a temporary failure rather than a permanent one. The attack exploited vulnerabilities in facial recognition systems by using 3-D rendering and publicly available photos to trick authentication, succeeding against four of the five systems tested [46971]. The failure arose under specific circumstances: the method the researchers used to create 3-D face replicas and the limitations of the tested systems in detecting such spoofing. The articles also mention potential improvements in defense mechanisms against such attacks, suggesting the failure was not permanent but the result of existing vulnerabilities that can be addressed with advances in technology and security measures.
Behaviour omission, value, other
(a) crash: the articles do not mention any failure in which the system loses state and performs none of its intended functions.
(b) omission: the facial recognition systems omitted their intended function of admitting only genuine, live faces; they were tricked by 3-D renders, built from publicly available photos and displayed with virtual reality technology, into misidentifying the individuals [46971].
(c) timing: the articles do not mention any failure in which the system performs its intended functions correctly but too late or too early.
(d) value: the systems performed their intended function incorrectly, authenticating individuals based on spoofed 3-D face replicas [46971].
(e) byzantine: the articles do not mention any erroneous behavior with inconsistent responses and interactions.
(f) other: the incident also exposed the broader vulnerability of biometric authentication systems to spoofing via methods such as 3-D face replicas, virtual reality renders, and 3-D printed masks, highlighting a security flaw in these authentication systems [46971].
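The success rates quoted in the Phase rationale above (55 to 85 percent with public web photos, and four of five systems fooled) are simple per-system ratios. The sketch below computes them from trial tallies; the counts are hypothetical stand-ins, since the article reports only the aggregate percentages.

```python
# Hypothetical per-system tallies of (successful spoofs, attempts) using
# renders built from public web photos. The numbers are illustrative only;
# the article reports success rates between 55% and 85% for the four
# systems that were fooled, with a fifth system resisting the attack.
web_photo_trials = {
    "system_a": (17, 20),
    "system_b": (11, 20),
    "system_c": (14, 20),
    "system_d": (13, 20),
    "system_e": (0, 20),
}

def success_rate(spoofed: int, attempts: int) -> float:
    return spoofed / attempts

for name, (spoofed, attempts) in web_photo_trials.items():
    print(f"{name}: {spoofed}/{attempts} spoofed ({success_rate(spoofed, attempts):.0%})")
```

Under these assumed tallies, four systems fall within the reported 55 to 85 percent band and one resists entirely, matching the aggregate outcome the article describes.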

IoT System Layer

Layer Option Rationale
Perception None None
Communication None None
Application None None

Other Details

Category Option Rationale
Consequence no_consequence (a) death: There is no mention of any individuals losing their lives due to the software failure incident; the attack was a controlled research demonstration, and no real-world harm is reported in the article [46971].
Domain information (a) The failed system in the article was related to the information industry, specifically in the field of computer vision and facial recognition systems. The system was intended to support authentication and security measures using biometric data such as facial features [46971].

Sources

[46971] Article describing the UNC facial recognition spoofing research, published August 19, 2016.