| Recurring |
multiple_organization
(a) The software failure incident related to facial recognition spoofing via 3-D rendering technology has recurred across multiple organizations. Researchers from the University of North Carolina demonstrated a method for defeating facial recognition systems by building 3-D facial models from publicly available photos and using them to spoof authentication systems [46971]. This incident highlights the vulnerability of facial recognition systems to such attacks and the need for stronger anti-spoofing measures. |
| Phase (Design/Operation) |
design, operation |
(a) The software failure incident related to the design phase is evident in the authentication systems' lack of designed-in defenses against spoofing: researchers from the University of North Carolina presented a system that used digital 3-D facial models, built from publicly available photos, to defeat facial recognition systems. The researchers collected images of volunteers through image search engines, professional photos, and publicly available assets on social networks such as Facebook, LinkedIn, and Google+, and used them to create 3-D face renders that successfully spoofed four of the five systems they tested [46971].
(b) The software failure incident related to the operation phase is evident in the same article: the researchers tested their virtual reality face renders against five authentication systems, including consumer software distributed through the Google Play Store and the iTunes Store. Using control photos, they tricked all five systems in every case they tested; using public web photos, they tricked four of the five systems, with success rates ranging from 55 percent to 85 percent [46971]. |
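The attack described above follows a four-step pipeline: harvest public photos of the target, fit a textured 3-D facial model to them, animate the model with liveness cues such as blinking or smiling, and present the animated render to the authentication camera through a mobile VR display. The sketch below is purely illustrative and is not the researchers' implementation: every function name (`collect_public_photos`, `reconstruct_3d_face`, and so on) is an assumption, and every body is a placeholder that only mirrors the structure of the attack.

```python
# Illustrative outline (NOT the researchers' code) of the spoofing
# pipeline described in the article. All bodies are placeholders.

from dataclasses import dataclass


@dataclass
class FaceModel:
    source_photos: list      # public web photos of the target
    animated: bool = False   # liveness cues (blink/smile) applied?


def collect_public_photos(target_name: str) -> list:
    """Step 1: harvest photos from search engines and social networks."""
    return [f"{target_name}_photo_{i}.jpg" for i in range(3)]  # placeholder


def reconstruct_3d_face(photos: list) -> FaceModel:
    """Step 2: fit a textured 3-D facial model to the photos."""
    return FaceModel(source_photos=photos)


def add_liveness_cues(model: FaceModel) -> FaceModel:
    """Step 3: animate the render so it can pass blink/smile checks."""
    model.animated = True
    return model


def present_via_vr(model: FaceModel, system: str) -> bool:
    """Step 4: show the animated render to the authentication camera.
    Placeholder: assume success whenever liveness cues are present."""
    return model.animated


# End-to-end run against one hypothetical system.
model = add_liveness_cues(reconstruct_3d_face(collect_public_photos("alice")))
spoofed = present_via_vr(model, "hypothetical_system_A")
```

The point of the sketch is that no step requires privileged access: every input is publicly available, which is exactly what made the attack practical.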
| Boundary (Internal/External) |
within_system |
(a) within_system: The software failure incident discussed in the articles is related to facial recognition systems being tricked by 3-D rendering and virtual reality technology based on publicly available photos [46971]. The failure originates from within the system as the facial recognition systems were unable to distinguish between real faces and 3-D renders created from online photos, highlighting a vulnerability in the authentication process. The attack was successful in spoofing four out of the five systems tested, indicating a failure within the system's ability to accurately authenticate individuals based on facial features. |
| Nature (Human/Non-human) |
non-human_actions |
(a) The software failure incident discussed in the articles is primarily attributed to non-human actions: the facial recognition systems themselves were vulnerable to being tricked by 3-D renders built from publicly available photos and displayed with mobile virtual reality technology. The researchers successfully spoofed four of the five systems they tested using this method, highlighting the risks of authenticating identity with biometrics [46971]. The incident shows how software systems can fail due to factors introduced without direct human participation in the system's operation, such as the exploitation of publicly available data and the limitations of current authentication technologies. |
| Dimension (Hardware/Software) |
hardware |
(a) The software failure incident related to hardware: The articles discuss the need for consumer face authentication systems to evolve by incorporating hardware and sensors beyond mobile cameras or webcams. Some vendors, such as Microsoft with its Windows Hello software, already ship commercial solutions that leverage alternative hardware, but implementing such hardware on mobile devices, where space is limited, is challenging. Adding specialized components like IR cameras or structured-light projectors may be necessary to strengthen security mechanisms, but it imposes a cost-benefit trade-off on hardware vendors [46971].
(b) The software failure incident related to software: The articles do not attribute the incident to a defect originating in the software itself. Instead, they focus on the vulnerability of facial recognition systems to spoofing attacks using 3-D face replicas created from publicly available photos. The attack tricked multiple authentication systems with digital 3-D facial models based on online images, highlighting the risks that arise when biometric data is compromised or publicly available [46971]. |
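As a concrete illustration of why depth-sensing hardware such as IR cameras or structured-light projectors can help, the sketch below implements a minimal depth-based liveness check under assumed conditions: a photo or screen held up to a depth sensor is nearly planar, so its depth readings vary little, whereas a real face has pronounced relief. The threshold and the synthetic depth maps are illustrative assumptions, not values from the article.

```python
# Minimal sketch of a depth-based liveness check of the kind that extra
# hardware (IR camera / structured-light projector) makes possible.
# Threshold and data are illustrative assumptions.

from statistics import pvariance


def is_live_face(depth_map, min_depth_variance=4.0):
    """Reject near-planar surfaces (screens, printed photos) by requiring
    a minimum variance in the sensed depth values."""
    depths = [d for row in depth_map for d in row]
    return pvariance(depths) >= min_depth_variance


# A flat display: depth readings (cm) are almost identical everywhere.
flat_screen = [[30.0, 30.1, 30.0],
               [30.1, 30.0, 30.1],
               [30.0, 30.1, 30.0]]

# A real face: the nose sits closer to the sensor than cheeks and forehead.
real_face = [[34.0, 31.0, 34.0],
             [32.0, 25.0, 32.0],
             [33.0, 30.0, 33.0]]

print(is_live_face(flat_screen))  # planar surface fails the check
print(is_live_face(real_face))    # facial relief passes it
```

A 2-D render shown on a phone or VR display would fail this check regardless of how convincing its texture is, which is why the discussion above frames depth sensing as a hardware-level mitigation rather than a software patch.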
| Objective (Malicious/Non-malicious) |
malicious |
(a) The software failure incident described in the articles is malicious in nature. Researchers from the University of North Carolina presented a system at the Usenix security conference that used digital 3-D facial models based on publicly available photos to defeat facial recognition systems. The attack successfully spoofed four of the five systems they tested, demonstrating a method of stealing a face through 3-D rendering and Internet stalking [46971]. The incident involved intentionally tricking facial recognition systems, underscoring the malicious intent behind the failure. |
| Intent (Poor/Accidental Decisions) |
unknown |
(a) The intent of the software failure incident:
The software failure incident described in the article was not due to poor decisions but to deliberate research by security and computer vision specialists from the University of North Carolina, who set out to expose the vulnerabilities of facial recognition systems. The researchers demonstrated a method of stealing a face using 3-D rendering and publicly available photos: they collected images of volunteers through image search engines, professional photos, and social networks, and used them to create virtual reality face renders that successfully spoofed authentication systems [46971]. |
| Capability (Incompetence/Accidental) |
accidental |
(a) The software failure incident reported in the articles can partly be attributed to development incompetence on the part of the authentication-system developers, whose products lacked robust anti-spoofing measures. Researchers from the University of North Carolina demonstrated a method of stealing a face by using digital 3-D facial models based on publicly available photos to defeat facial recognition systems [46971]. They collected images of volunteers through image search engines, professional photos, and publicly available assets on social networks such as Facebook, LinkedIn, and Google+, and used them to create virtual reality face renders that successfully spoofed the authentication systems [46971]. This highlights the downside of authenticating identity with biometrics when biometric data is compromised or publicly available.
(b) The software failure incident can also be considered accidental in the sense that the vulnerability was not deliberately introduced: the systems' developers did not anticipate attacks built from 3-D face replicas based on publicly available photos [46971]. The researchers found that even individuals who actively protect their privacy online had their faces easily replicated and used to trick authentication systems, indicating an unintended exposure of such systems to attacks leveraging publicly available data [46971]. This underscores the need for continuous improvement in security measures to prevent unauthorized access and misuse of personal data through facial recognition technology. |
| Duration |
temporary |
The software failure incident discussed in the articles aligns more with a temporary failure than a permanent one. The incident involved a specific attack method that exploited vulnerabilities in facial recognition systems by using 3-D rendering and publicly available photos to trick authentication systems [46971]. The attack spoofed four of the five systems tested, indicating a failure tied to particular circumstances: the researchers' method of creating 3-D face replicas and the current systems' limited ability to detect such spoofing attempts. The articles also mention potential improvements in defense mechanisms against such attacks, suggesting that the failure was not permanent but the result of existing vulnerabilities that can be addressed through advances in technology and security measures. |
| Behaviour |
omission, value, other |
(a) crash: The articles do not mention any software failure incident related to a crash where the system loses state and does not perform any of its intended functions.
(b) omission: The software failure incident described in the articles involves the system omitting to perform its intended function. The facial recognition systems, tricked by 3-D renders based on publicly available photos and displayed with virtual reality technology, failed to reject the spoofed faces and thus omitted their intended function of distinguishing real users from replicas [46971].
(c) timing: The articles do not mention any software failure incident related to timing, where the system performs its intended functions correctly but too late or too early.
(d) value: The software failure incident described in the articles also involves the system performing its intended function incorrectly: the facial recognition systems, spoofed by 3-D face replicas, returned incorrect authentication results, accepting impostors as legitimate users [46971].
(e) byzantine: The articles do not mention any software failure incident related to a byzantine behavior where the system behaves erroneously with inconsistent responses and interactions.
(f) other: The other behavior observed in the software failure incident described in the articles is the vulnerability of biometric authentication systems to spoofing attacks using various methods like 3-D face replicas, virtual reality renders, and 3-D printed masks. This highlights a security flaw in the authentication systems [46971]. |