Incident: Facebook's Algorithmic Flaw: Amplifying Anger and Misinformation

Published Date: 2021-10-26

Postmortem Analysis
Timeline 1. The software failure incident involving Facebook's algorithm favoring "angry" emoji reactions unfolded over several years, beginning at least as early as 2017, when Facebook's ranking algorithm started treating emoji reactions as five times as valuable as likes [119802].
System The system that failed in this incident is: 1. Facebook's news feed ranking algorithm, which favored emoji reactions, particularly the "angry" reaction, and pushed more emotional and provocative content into users' news feeds [119802].
Responsible Organization 1. Facebook engineers [119802]
Impacted Organization 1. Users of Facebook [119802]
Software Causes 1. Facebook engineers programmed the ranking algorithm to weight emoji reactions, particularly the "angry" reaction, more heavily than likes when deciding what content appears in users' news feeds [119802]. 2. By rewarding emotional and provocative content, including content likely to make users angry, the algorithm amplified misinformation, toxicity, and low-quality news on the platform [119802]. 3. Engineers could adjust many such levers to shape the flow of information and conversation on the platform, influencing users' emotions, political campaigns, and other downstream behavior [119802]. 4. The algorithm used machine-learning models to predict engagement and collapsed those predictions into a single score for each post in each user's feed, which determined ranking (see the sketch after this list) [119802]. 5. Facebook's experimentation culture, in which engineers ran experiments that manipulated users' emotions and interactions on the platform, also contributed to the incident and raised ethical concerns [119802].
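The weighting mechanism at the heart of the incident can be illustrated with a minimal sketch. The scoring function and reaction names below are assumptions for illustration only; the one fact taken from the article is the reported 2017 rule that every emoji reaction counted five times as much as a like [119802].

```python
# Minimal sketch (not Facebook's actual code): a per-post ranking score that
# collapses engagement signals into one number. Under the reported 2017
# weighting, every emoji reaction, including "angry", counts five times as
# much as a like, so anger-heavy posts rise in the feed.

REACTION_WEIGHTS_2017 = {
    "like": 1.0,
    "love": 5.0,
    "haha": 5.0,
    "wow": 5.0,
    "sad": 5.0,
    "angry": 5.0,
}

def engagement_score(counts: dict[str, int], weights: dict[str, float]) -> float:
    """Sum each signal's count times its weight to get a single post score."""
    return sum(weights.get(signal, 0.0) * n for signal, n in counts.items())

# A provocative post with 200 angry reactions (score 1050) outranks a
# benign post with 600 likes (score 600) under these weights.
provocative = {"like": 50, "angry": 200}
benign = {"like": 600}
assert engagement_score(provocative, REACTION_WEIGHTS_2017) > engagement_score(benign, REACTION_WEIGHTS_2017)
```

Because the score is a simple weighted sum, any signal with an outsized weight dominates ranking, which is how the extra value on "angry" translated directly into more anger-inducing content in feeds.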
Non-software Causes 1. Favoring controversial and anger-inducing posts to increase user engagement, leading to the spread of misinformation and toxicity [119802] 2. Human judgments and subjective decisions underlying Facebook's news feed algorithm, impacting the type of content shown to users [119802] 3. Ethical concerns regarding manipulating the emotional valence of posts shown to users to influence their moods [119802]
Impacts 1. The software failure incident at Facebook, in which the algorithm favored "angry" emoji reactions, pushed more emotional and provocative content, including misinformation, toxicity, and low-quality news, into users' news feeds [119802]. 2. The incident undermined the efforts of Facebook's content moderators and integrity teams who were trying to combat toxic and harmful content on the platform [119802]. 3. The incident exposed the subjective human judgments underlying Facebook's news feed algorithm, showing how human decisions shaped what content billions of users saw [119802]. 4. Algorithmically promoting anger-inducing posts degraded users' experience and exposed them to harms such as misinformation and toxicity [119802]. 5. The incident prompted Facebook to adjust the algorithm, for example by reducing the weight of the "angry" reaction to zero and boosting the value of other reactions such as "love" and "sad," to mitigate these negative impacts [119802].
Preventions 1. Implementing stricter oversight and review processes for algorithm changes: By having a more robust system in place to evaluate the potential impacts of algorithm changes, Facebook could have identified the negative consequences of favoring "angry" reactions earlier and taken corrective actions [119802]. 2. Conducting thorough testing and analysis before implementing changes: Prior to rolling out changes to the algorithm, Facebook could have conducted more comprehensive testing and analysis to understand the potential implications of giving extra weight to certain types of reactions, such as the "angry" emoji, on user experience and content quality [119802]. 3. Listening to internal warnings and feedback: Facebook employees raised concerns about the negative effects of prioritizing controversial and anger-inducing content, but these warnings were not always heeded. By actively listening to internal feedback and addressing concerns promptly, Facebook could have potentially avoided the software failure incident [119802].
Fixes 1. Adjusting the algorithm so it stops amplifying content that might subvert democratic norms, by giving angry emoji reactions less weight or removing the button altogether [119802]. 2. Implementing mechanisms to demote content that receives disproportionately angry reactions [119802]. 3. Fine-tuning signals and weightings to mitigate harm, such as boosting "love" and "sad" reactions while reducing the weight of all reactions to one and a half times that of a like [119802]. 4. Setting the weight on the angry reaction to zero, which reduced misinformation, disturbing content, and graphic violence on the platform (a sketch of the adjusted weighting follows this list) [119802].
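Continuing the earlier sketch, the fix amounts to changing the weight table. Only the zero weight on "angry" and the one-and-a-half-times-a-like baseline come from the article; the boosted values for "love" and "sad" are assumptions for illustration, and the demotion helper is a hypothetical rendering of fix 2, not a documented Facebook mechanism.

```python
# Illustrative post-fix weights (only angry = 0 and the 1.5x baseline are
# reported; the "love"/"sad" boost values are assumptions).
REACTION_WEIGHTS_AFTER_FIX = {
    "like": 1.0,
    "love": 2.0,   # boosted (assumed value)
    "sad": 2.0,    # boosted (assumed value)
    "haha": 1.5,   # reactions reduced to 1.5x a like
    "wow": 1.5,
    "angry": 0.0,  # weight set to zero, as reported
}

def engagement_score(counts: dict[str, int], weights: dict[str, float]) -> float:
    """Same weighted sum as in the earlier sketch."""
    return sum(weights.get(signal, 0.0) * n for signal, n in counts.items())

def demote_if_anger_dominates(score: float, counts: dict[str, int],
                              share_threshold: float = 0.5,
                              penalty: float = 0.5) -> float:
    """Hypothetical version of fix 2: if angry reactions dominate a post's
    engagement, scale its score down rather than merely ignoring them."""
    total = sum(counts.values()) or 1
    if counts.get("angry", 0) / total > share_threshold:
        score *= penalty
    return score

# With angry weighted at zero, the provocative post from the earlier sketch
# (score 50) no longer outranks the benign one (score 600).
provocative = {"like": 50, "angry": 200}
benign = {"like": 600}
assert engagement_score(provocative, REACTION_WEIGHTS_AFTER_FIX) < engagement_score(benign, REACTION_WEIGHTS_AFTER_FIX)
```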
References 1. Internal documents from Facebook [119802] 2. Disclosures made to the Securities and Exchange Commission [119802] 3. Testimonies provided to Congress [119802] 4. Statements from whistleblower Frances Haugen [119802] 5. Comments from Facebook spokesperson Dani Lever [119802]

Software Taxonomy of Faults

Category Option Rationale
Recurring one_organization (a) The software failure incident related to Facebook's algorithm favoring "angry" emoji reactions and promoting emotionally provocative content recurred within the same organization. Facebook engineers gave extra value to emoji reactions, including "angry," which promoted content likely to make users angry and spread misinformation. Despite internal warnings and debates, the algorithm continued to prioritize these reactions, amplifying toxic and harmful content on the platform [119802]. (b) The provided articles do not mention a similar incident involving other organizations or their products and services.
Phase (Design/Operation) unknown The articles do not explicitly attribute the failure to a specific development phase such as design or operation.
Boundary (Internal/External) within_system, outside_system The software failure incident can be categorized as both within_system and outside_system: (a) within_system: The failure originated inside the system, because Facebook's own engineers programmed the algorithm to prioritize emoji reactions, particularly the "angry" reaction, which pushed emotional and provocative content, including misinformation and toxicity, into users' news feeds [119802]. (b) outside_system: Factors outside the system also contributed, chiefly user behavior: the ranking depended on how users reacted, so the prevalence of angry reactions to provocative posts fed back into the algorithm and amplified that content, with broader effects on users' emotions, political campaigns, and societal norms [119802].
Nature (Human/Non-human) non-human_actions, human_actions (a) The software failure incident occurring due to non-human actions: The software failure incident discussed in the articles is primarily related to the algorithmic decisions made by Facebook engineers to prioritize emoji reactions, particularly the "angry" reaction, in determining what content users see in their news feeds. The algorithm was programmed to give extra value to emoji reactions, including "angry," which led to the promotion of more emotional and provocative content, including misinformation and toxic content, into users' feeds [119802]. (b) The software failure incident occurring due to human actions: The failure can also be attributed to human actions, specifically the decisions made by Facebook engineers and executives to prioritize certain types of engagement signals, such as emoji reactions, over others. The internal debate and decision-making process within Facebook regarding the weighting of different reactions and the impact on user experience and content quality highlight the role of human judgment in shaping the algorithm and its consequences [119802].
Dimension (Hardware/Software) software (a) The articles do not provide information about a software failure incident occurring due to contributing factors originating in hardware. (b) The software failure incident discussed in the articles is related to the internal algorithm of Facebook's news feed system. Facebook engineers programmed the algorithm to prioritize emoji reactions, including the "angry" reaction, which led to the promotion of emotional and provocative content, including misinformation and toxicity, in users' feeds [119802]. The incident involved a flaw in the algorithm's design and weighting system, which resulted in the amplification of harmful content on the platform.
Objective (Malicious/Non-malicious) non-malicious (a) The software failure incident described in the articles can be categorized as non-malicious. The failure was not due to any malicious intent but rather stemmed from the decisions made by Facebook engineers to prioritize certain types of content based on user reactions, particularly the "angry" emoji. This decision led to the amplification of misinformation, toxicity, and low-quality news on the platform, ultimately undermining the efforts of Facebook's content moderators and integrity teams [119802]. The incident highlights how the manipulation of algorithmic levers and the weighting of different types of user engagement signals can have unintended negative consequences on the platform's content quality and user experience.
Intent (Poor/Accidental Decisions) poor_decisions, accidental_decisions (a) The intent of the software failure incident was related to poor decisions made by Facebook engineers and executives. They intentionally programmed the algorithm to give extra value to emoji reactions, including the "angry" reaction, in order to push more emotional and provocative content into users' news feeds. This decision was based on the theory that posts generating lots of reaction emoji would keep users more engaged, which was seen as key to Facebook's business success. However, this intentional decision led to the amplification of misinformation, toxicity, and low-quality news on the platform, undermining the efforts of content moderators and integrity teams [119802]. (b) The software failure incident was also influenced by accidental decisions or unintended consequences. Facebook's own researchers raised concerns about the critical flaw in favoring controversial and anger-inducing posts, as it could inadvertently lead to more spam, abuse, and clickbait on the platform. The company's data scientists confirmed that posts sparking angry reactions were disproportionately likely to include misinformation and harmful content. This unintended consequence of amplifying negative content was not the initial goal but rather a result of the decisions made to prioritize engagement and emotional reactions on the platform [119802].
Capability (Incompetence/Accidental) development_incompetence (a) The articles provide insights into a software failure incident related to development incompetence. Facebook engineers programmed the algorithm to use reaction emoji signals to push more emotional and provocative content into users' news feeds, including content likely to make them angry. This decision was based on the theory that posts generating lots of reaction emoji would keep users more engaged, which was crucial for Facebook's business. However, internal documents revealed that favoring controversial and anger-inducing posts could inadvertently open the door to more spam, abuse, and clickbait on the platform [119802]. Furthermore, Facebook's own researchers discovered in 2019 that posts sparking angry reactions were disproportionately likely to contain misinformation, toxicity, and low-quality news. This systematic amplification of negative content on the platform undermined the efforts of Facebook's content moderators and integrity teams who were striving to combat toxic and harmful content. The internal debate over the impact of the "angry" emoji on the platform highlighted the subjective human judgments underlying Facebook's news feed algorithm, emphasizing the complexities and potential flaws in the decision-making process [119802]. (b) The articles do not provide specific information about a software failure incident related to factors introduced accidentally.
Duration temporary The software failure incident can be categorized as a temporary failure: the contributing factors were present only under particular circumstances, namely the specific reaction weightings in force at the time, rather than permanently. The incident involved Facebook's algorithm favoring "angry" emoji reactions, which amplified emotional and provocative content, including misinformation and toxicity, in users' news feeds [119802]. The failure was not permanent, because adjustments were made over time to address the negative impacts of the angry reaction emoji, such as reducing its weight to zero and boosting other reactions like "love" and "sad" [119802].
Behaviour omission, value, other (a) crash: The articles do not mention any crash related to the software failure incident. (b) omission: By favoring controversial, anger-inducing posts, the system omitted to perform its intended function of curbing harmful content, failing to prevent the spread of misinformation, toxicity, and low-quality news and undermining the efforts of Facebook's content moderators and integrity teams [119802]. (c) timing: The articles do not mention any timing-related failure. (d) value: The system performed its intended function incorrectly: the algorithm gave extra value to emoji reactions, including "angry," which promoted emotional and provocative content, including misinformation and toxicity, in users' news feeds [119802]. (e) byzantine: The articles describe the news feed ranking system as a byzantine machine-learning system, but this refers to its complexity, opacity, and the subjective human judgments underlying it, rather than to byzantine failure behavior [119802]. (f) other: The system also behaved in ways not covered by the options above: Facebook engineers manipulated many levers to shape the flow of information and conversation on the platform, using factors such as the number and type of comments, computing load, and user engagement to predict how likely a post is to be engaged with, illustrating the intricate and multifaceted nature of the algorithm's decision-making [119802].

IoT System Layer

Layer Option Rationale
Perception None None
Communication None None
Application None None

Other Details

Category Option Rationale
Consequence property, theoretical_consequence, other (property) The software failure incident discussed in the articles did not directly result in any physical harm, death, or impact on basic needs like food or shelter. However, the incident had significant consequences related to property and data. The software failure led to the amplification of misinformation, toxicity, and low-quality news on Facebook's platform, impacting users' experience and potentially influencing various aspects of society, including political campaigns and public discourse [119802]. The incident also highlighted the challenges faced by Facebook's content moderators and integrity teams in combating harmful content on the platform. Additionally, the software failure incident involved the manipulation of various levers and signals within Facebook's algorithm, affecting the distribution and visibility of content, which could have implications for users' interactions and the type of information they are exposed to [119802].
Domain information The software failure incident discussed in the articles falls within the information industry (option a). It involves Facebook's algorithm that determines what content users see in their news feeds, specifically how the algorithm favored emotional and provocative content, including content likely to make users angry. This failure amplified misinformation, toxicity, and low-quality news on the platform, affecting the production and distribution of information [119802].

Sources
