Incident: Google Photos Mislabeling Black People as Gorillas

Published Date: 2015-07-01

Postmortem Analysis
Timeline 1. The software failure incident, in which Google's image recognition software mislabeled photographs of black people as gorillas, occurred in July 2015, according to an article published on July 1, 2015 [36795].
System 1. Google Photos image recognition software [36795]
Responsible Organization 1. Google, whose image recognition software mislabeled the photographs of black people as gorillas, was responsible for the software failure incident [36795].
Impacted Organization 1. Users of Google Photos [36795]
Software Causes 1. A flaw in the auto-tagging feature of the Google Photos application caused the image recognition software to mislabel photographs of black people as gorillas [36795].
Non-software Causes 1. Lack of diverse image data: the image data collected for Google Photos was insufficiently diverse, which contributed to black people being mislabeled as gorillas [36795]. 2. Insufficient testing: the image recognition software was not tested thoroughly enough to ensure it detected faces accurately, especially dark-skinned faces [36795].
Impacts 1. Google's image recognition software mislabeled images of black people as gorillas, causing outrage among users and embarrassment for the company [36795]. 2. The affected user, computer programmer Jacky Alcine, publicized the problem to Google and the public in a series of tweets, prompting a public apology from Google and a promise to fix the issue [36795]. 3. Yonatan Zunger, Google's chief architect of social, expressed horror at the incident and said engineers were working on several fixes to prevent similar issues in the future, underscoring the seriousness of the impact [36795].
Preventions 1. Training the image recognition software on more diverse and comprehensive data to improve accuracy, especially in recognizing dark-skinned faces [36795]. 2. Conducting thorough testing and quality assurance to identify and address biases or inaccuracies in the software's algorithms, for example by measuring accuracy separately for each demographic group, as in the sketch below [36795]. 3. Building more robust error-handling mechanisms into the software to detect and correct mislabeling before it reaches users [36795].
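
A minimal sketch of prevention 2, assuming a hypothetical classifier exposing a predict(image) method and an evaluation set annotated with a demographic group field; both are illustrative assumptions, since the article does not describe Google's test harness:

    # Hypothetical sketch: disaggregated accuracy testing for an image tagger.
    # `model.predict` and the (image, true_label, group) sample format are
    # illustrative assumptions, not Google's actual API.
    from collections import defaultdict

    def per_group_accuracy(model, samples):
        """Return tagging accuracy computed separately per demographic group."""
        correct = defaultdict(int)
        total = defaultdict(int)
        for image, true_label, group in samples:
            total[group] += 1
            if model.predict(image) == true_label:
                correct[group] += 1
        return {group: correct[group] / total[group] for group in total}

A release gate could then block deployment whenever accuracy for any group falls well below the overall average, which would have surfaced the poor recognition of dark-skinned faces before launch.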
Fixes 1. As an immediate stopgap, turning off the ability for photographs to be grouped under the label 'gorilla' (a label-suppression approach is sketched below) [36795]. 2. Longer-term fixes around linguistics, i.e. which words to be careful about applying to photos of people, and improvements to image recognition itself, especially better recognition of dark-skinned faces [36795].
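
A minimal sketch of the short-term fix, assuming the tagger emits (label, confidence) pairs; the blocklist contents and function name are illustrative, since the article reports only that Google disabled the 'gorilla' label, not how:

    # Hypothetical sketch: suppressing high-risk labels before they reach users.
    SENSITIVE_LABELS = {"gorilla", "gorillas", "ape", "apes"}

    def filter_tags(predicted_tags, min_confidence=0.9):
        """Drop blocklisted labels and low-confidence guesses from tagger output."""
        safe = []
        for label, confidence in predicted_tags:
            if label.lower() in SENSITIVE_LABELS:
                continue  # never surface a blocklisted label, whatever its score
            if confidence >= min_confidence:
                safe.append((label, confidence))
        return safe

This mirrors the 'linguistics' side of the longer-term fix: deciding which words must never be auto-applied to photos of people, independently of the model's confidence.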
References 1. Tweets from computer programmer Jacky Alcine [36795]. 2. Responses from Yonatan Zunger, chief architect of social at Google [36795].

Software Taxonomy of Faults

Category Option Rationale
Recurring multiple_organization (a) The articles do not report the mislabeling incident happening again at Google itself, beyond an initial fix that failed to remove all of the offending labels [36795]. (b) A similar incident did occur at another organization: just over a month before this incident, Flickr's autotagging system placed potentially offensive tags on images, mislabeling concentration camps as 'jungle gyms' and people as apes [36795].
Phase (Design/Operation) design, operation (a) Design: the failure stemmed from a flaw in the design of the auto-tagging feature of Google Photos, which was intended to help organize images but mislabeled photographs of black individuals as gorillas [36795]. (b) Operation: even after a fix had been issued, the user reported that two photos still appeared under the terms "gorilla" and "gorillas", indicating that the fix did not fully resolve the issue in operation [36795].
Boundary (Internal/External) within_system (a) This was a within_system failure: the fault lay in the image recognition software itself, which mislabeled images uploaded to Google Photos [36795]. Google acknowledged the fault and set engineers to work on fixes to prevent similar issues, noting that its image labeling technology was still in its infancy and not yet perfect, which points to an internal limitation of the software's capabilities [36795].
Nature (Human/Non-human) non-human_actions (a) The failure in Article 36795 resulted from non-human actions: the auto-tagging feature mislabeled photographs of black people as gorillas without any human intention or involvement, leading to outrage and apologies from Google. The fault lay in the software's image recognition algorithms and the lack of accurate data samples for dark-skinned faces, not in deliberate human action [36795].
Dimension (Hardware/Software) software (a) Hardware: the incident reported in article [36795] was not due to hardware issues. (b) Software: the failure originated in the algorithms and programming of the image recognition feature of the Google Photos application, which mislabeled images of black people as gorillas, exposing a flaw in the software's image recognition capabilities [36795].
Objective (Malicious/Non-malicious) non-malicious (a) The software failure incident related to the mislabeling of photographs of black people as gorillas by Google's image recognition software was non-malicious. The incident was attributed to a flaw in the software's image recognition algorithm, which led to the misclassification of images without any malicious intent involved. Google issued an apology for the mistake and mentioned that their image labeling technology was still in its early stages and not yet perfect [36795].
Intent (Poor/Accidental Decisions) accidental_decisions (a) The failure stemmed from accidental decisions: the mislabeling of photographs of black people as gorillas was an unintended mistake rather than a deliberate poor decision. Google issued an apology and noted that its image labeling technology was still in its infancy and not yet perfect, indicating an unintended consequence of the software's operation [36795].
Capability (Incompetence/Accidental) development_incompetence (a) Development incompetence is evident in the flawed auto-tagging feature of the Google Photos application, which mislabeled photographs of black people as gorillas, provoked outrage among users, and required an apology from Google [36795]. (b) The accidental aspect is highlighted by Google's acknowledgment that the mislabeling was unintentional: the company expressed genuine sorrow for the error and noted that its image labeling technology was still in its early stages and not yet perfect [36795].
Duration temporary (a) The failure was temporary. Google acknowledged the mistake, issued a fix, and, as a stopgap, turned off the ability for photographs to be grouped under the label 'gorilla' [36795]. The chief architect of social at Google said engineers were also working on longer-term fixes to improve image recognition, especially recognition of dark-skinned faces, to prevent similar issues in the future [36795].
Behaviour omission, value, other (a) crash: not applicable; the system did not lose state or stop performing its intended functions [36795]. (b) omission: the system omitted its intended function at specific instances by failing to label the individuals in the images correctly, which led to outrage and an apology from Google [36795]. (c) timing: not applicable; the failure did not involve functions performed too late or too early [36795]. (d) value: the system performed its intended function incorrectly by assigning the label 'gorillas' to photographs of people [36795]. (e) byzantine: not applicable; the system did not behave with inconsistent responses or interactions [36795]. (f) other: the system produced inaccurate and potentially offensive labels, triggering significant backlash and requiring immediate action from Google to stop such labels from appearing [36795].

IoT System Layer

Layer Option Rationale
Perception None None
Communication None None
Application None None

Other Details

Category Option Rationale
Consequence no_consequence (a) death: there is no mention of any deaths resulting from the software failure incident reported in the articles [36795].
Domain information (a) The software failure incident relates to the information industry: Google's image recognition software mislabeled photographs of black people as gorillas within the Google Photos application, a service designed to organize uploaded images and make searching easier [36795].
