Multiple Case Decision
Reports on the War in Gaza
2 cases included in this bundle
IG-50OFM0LV
Case about dangerous individuals and organizations on Instagram
FB-VXKB1TZ5
Case about dangerous individuals and organizations on Facebook
Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.
Summary
In these summary decisions, the Board reviewed two posts reporting on the war in Gaza. After the Board brought these two appeals to Meta’s attention, the company reversed its original decisions and restored both posts.
About the Cases
In the first case, an Instagram user posted a short video in February 2024 from a Channel 4 News (UK) report on the killing of a Palestinian child. The video’s caption expressly states that the post does not promote dangerous organizations or individuals and that it tells the story of a Palestinian family and humanitarian workers.
In the second case, a Facebook user posted a video in January 2024 from an Al Jazeera report on the war in Gaza. The clip contains reporting and analysis on hostage-release negotiations between Israel and Hamas.
Meta originally removed the posts from Instagram and Facebook, respectively, citing its Dangerous Organizations and Individuals policy. Under this policy, the company removes “glorification,” “support” and “representation” of designated entities, their leaders, founders or prominent members, as well as unclear references to them.
In their appeals to the Board, both users stated that the videos were reports from media outlets and did not violate Meta’s Community Standards. After the Board brought these two cases to Meta’s attention, the company determined that the posts did not violate its policies and restored both pieces of content to its platforms.
Board Authority and Scope
The Board has authority to review Meta’s decisions following appeals from the users whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.
Significance of Cases
These cases highlight errors in enforcement of an exception to the Dangerous Organizations and Individuals policy that allows content “reporting on, neutrally discussing or condemning dangerous organizations and individuals and their activities,” in order to safeguard a space for “social and political discourse.” This kind of error undermines genuine efforts to report on and raise awareness about the ongoing conflict in Gaza and other conflict-affected regions.
The Board has issued several recommendations to improve enforcement of Meta’s Dangerous Organizations and Individuals policy. In a policy advisory opinion, the Board asked Meta to “explain the methods it uses to assess the accuracy of human review and the performance of automated systems in the enforcement of its Dangerous Organizations and Individuals policy” (Referring to Designated Dangerous Individuals as “Shaheed,” recommendation no. 6). The Board has also urged Meta to “assess the accuracy of reviewers enforcing the reporting allowance under the Dangerous Organizations and Individuals policy in order to identify systemic issues causing enforcement errors,” a recommendation Meta has reported implementing but has not published information to demonstrate (Mention of the Taliban in News Reporting, recommendation no. 5). Furthermore, the Board has recommended that Meta “add criteria and illustrative examples to Meta’s Dangerous Organizations and Individuals policy to increase understanding of exceptions, specifically around neutral discussion and news reporting,” a recommendation for which Meta has demonstrated implementation through published information (Shared Al Jazeera Post, recommendation no. 1).
Decision
The Board overturns Meta’s original decisions to remove the two pieces of content. The Board acknowledges Meta’s corrections of its initial errors once the Board brought these cases to the company’s attention.