Overturned
Washington Post Article on Israel-Palestine
April 4, 2024
This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention, and they include information about Meta’s acknowledged errors. They are approved by a Board Member panel rather than the full Board, do not involve public comments, and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections while identifying where Meta could improve its enforcement.
Case Summary
A user appealed Meta’s decision to remove a Facebook post with a link to a Washington Post article that addressed the chronology of the Israeli-Palestinian conflict. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.
Case Description and Background
In October 2023, a Facebook user posted a link to a Washington Post article covering the chronology of the Israeli-Palestinian conflict. The article preview, which was automatically included with the link, mentioned Hamas. The user did not add a caption to accompany the post or provide any further context.
This Facebook post was removed under Meta’s Dangerous Organizations and Individuals policy, which prohibits representation of, and certain speech about, the groups and people the company deems linked to significant real-world harm.
In their appeal to the Board, the user emphasized that the post was intended to report on the current Israel-Hamas conflict and was not meant to provide support for Hamas or any other dangerous organization.
After the Board brought this case to Meta’s attention, the company determined that the content did not violate the Dangerous Organizations and Individuals policy because the post references Hamas in a news-reporting context, which the policy allows. The company then restored the content to Facebook.
Board Authority and Scope
The Board has authority to review Meta’s decision following an appeal from the person whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation processes involved, reduce errors and increase fairness for Facebook and Instagram users.
Case Significance
This case highlights an instance of Meta over-enforcing its Dangerous Organizations and Individuals policy, specifically against news reporting on entities the company designates as dangerous. This is a recurring problem, and one that has been particularly frequent during the current Israel-Hamas conflict, in which one of the parties is a designated organization. The Board has issued numerous recommendations relating to the news-reporting allowance under the Dangerous Organizations and Individuals policy. Continued errors in applying this important allowance can significantly limit users’ free expression, restrict the public’s access to information, and impair public discourse.
In a previous decision, the Board recommended that Meta “assess the accuracy of reviewers enforcing the reporting allowance under the Dangerous Organizations and Individuals policy in order to identify systemic issues causing enforcement errors” (Mention of the Taliban in News Reporting, recommendation no. 5). Meta reported this recommendation as work it already does, without publishing information to demonstrate implementation.

The Board also recommended that Meta “add criteria and illustrative examples to its Dangerous Organizations and Individuals policy to increase understanding of the exceptions for neutral discussion, condemnation and news reporting” (Shared Al Jazeera Post, recommendation no. 1). Implementation of this recommendation was demonstrated through published information.

Furthermore, the Board recommended that Meta “include more comprehensive information on error rates for enforcing rules on ‘praise’ and ‘support’ of dangerous individuals and organizations” in its transparency reporting (Ocalan’s Isolation, recommendation no. 12). Meta declined to implement this recommendation after conducting a feasibility assessment. In a policy update dated December 29, 2023, Meta replaced the term “praise” with “glorification” in its Community Standard.
The Board believes that full implementation of these recommendations could reduce the number of enforcement errors under Meta’s Dangerous Organizations and Individuals policy.
Decision
The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.