Overturned
Supreme Court in White Hoods
This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors and inform the public about the impact of the Board’s work. They are approved by a Board Member panel, not the full Board. They do not involve a public comments process and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.
Case Summary
A user appealed Meta’s decision to remove a Facebook post that contains an edited image of the Supreme Court of the United States, depicting six of the nine members wearing the robes of the Ku Klux Klan. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.
Case Description and Background
In July 2023, a user posted an edited image on Facebook that depicts six justices of the Supreme Court of the United States as members of the Ku Klux Klan, while the three justices considered to be more liberal appear unaltered. The post had no caption and received fewer than 200 views.
The post was removed for violating Meta’s Dangerous Organizations and Individuals policy. This policy prohibits content that contains praise, substantive support or representation of organizations or individuals that Meta deems dangerous.
In their appeal to the Board, the user emphasized that the post was intended as a political critique rather than an endorsement of the Ku Klux Klan. The user stated that the content highlights what the user regards as the six justices’ “prejudicial, hateful, and destructive attitudes toward women, women’s rights to choose abortions, the gay, lesbian, transgender and queer communities, and the welfare of other vulnerable groups.”
After the Board brought this case to Meta’s attention, the company determined that the content did not violate its Dangerous Organizations and Individuals policy and that the removal was incorrect. The company then restored the content to Facebook.
Board Authority and Scope
The Board has authority to review Meta’s decision following an appeal from the person whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation processes involved, reduce errors and increase fairness for Facebook and Instagram users.
Case Significance
This case highlights an error in Meta’s enforcement of its Dangerous Organizations and Individuals policy, specifically in relation to content shared as political critique. Repeated errors of this kind could significantly limit users’ free expression, and the company should make reducing them a high priority.
The Dangerous Organizations and Individuals Community Standard is the source of many erroneous takedowns and has been addressed in a number of prior Board decisions. In one earlier decision, the Board asked Meta to “explain in the Community Standards how users can make the intent behind their posts clear to Facebook.” To the same end, the Board also recommended that the company publicly disclose its list of designated individuals and organizations, and that “Facebook should also provide illustrative examples to demonstrate the line between permitted and prohibited content, including in relation to application of the rule clarifying what ‘support’ excludes” (Öcalan’s Isolation decision, recommendation no. 6). Meta committed to partial implementation of this recommendation. Additionally, the Board urged Meta to “include more comprehensive information on error rates for enforcing rules on ‘praise’ and ‘support’ of dangerous individuals and organizations” (Öcalan’s Isolation decision, recommendation no. 12). Meta declined to implement this recommendation following a feasibility assessment.
The Board emphasizes that full implementation of these recommendations could reduce the number of enforcement errors under Meta’s Dangerous Organizations and Individuals policy.
Decision
The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.