Overturned
Derogatory Image of Candidates for U.S. Elections
A user appealed Meta’s decision to remove content containing an altered and derogatory depiction of U.S. presidential candidate Kamala Harris and her running mate Tim Walz.
Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company's attention and include information about Meta's acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta's decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.
Summary
A user appealed Meta’s decision to remove content containing an altered and derogatory depiction of U.S. presidential candidate Kamala Harris and her running mate Tim Walz. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.
About the Case
In August 2024, a Facebook user posted an altered picture based on the poster for the 1994 comedy film “Dumb and Dumber.” In the altered image, the faces of the original actors are replaced with those of the U.S. presidential candidate, Vice President Kamala Harris, and her running mate, Minnesota Governor Tim Walz. As in the original poster, the two figures are grabbing each other’s nipples through their clothing. The content was posted with a caption that includes the emojis “🤷‍♂️🖕🖕.”

Meta initially removed the user’s post from Facebook under its Bullying and Harassment Community Standard, which prohibits “derogatory sexualized photoshop or drawings.” When the Board brought this case to Meta’s attention, the company determined that the removal was incorrect and restored the post to Facebook. Meta explained that the content did not violate its Community Standards because the company does not consider pinching a person’s nipple through their clothing to qualify as sexual activity.
Board Authority and Scope
The Board has authority to review Meta’s decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.
Significance of Case
In the Explicit AI Images of Female Public Figures cases, the Board decided that two AI-generated images violated Meta’s rule prohibiting “derogatory sexualized photoshop” under the Bullying and Harassment policy. Both images had been edited to place the faces of real public figures on a different (real or fictional) nude body. In this case, however, the Board highlights the overenforcement of Meta’s Bullying and Harassment policy against satire and political speech in the form of a non-sexualized derogatory depiction of political figures. It also points to the dangers of overenforcing this policy, especially in the context of an election, as it may lead to the excessive removal of political speech and undermine the ability to criticize government officials and political candidates, including in a sarcastic manner. This post is nothing more than a commonplace satirical image of prominent politicians and is instantly recognizable as such.

In the context of elections, the Board has previously recommended that Meta develop a framework for evaluating its election integrity efforts, to provide the company with relevant data to improve its content moderation system as a whole and to decide how best to employ its resources in electoral contexts (Brazilian General’s Speech, recommendation no. 1). Meta has reported progress on implementing this recommendation. Nonetheless, the company’s failure to recognize the nature of this post and treat it accordingly raises serious concerns about the systems and resources Meta has in place to make effective content determinations in such electoral contexts.
The Board has previously urged Meta to put in place adequate procedures for evaluating content in its relevant context. For example, the Board stated that Meta should: “Make sure it has adequate procedures in place to assess satirical content and relevant context properly” (“Two Buttons” Meme, recommendation no. 3). Meta reported implementing this recommendation but has yet to publish information demonstrating that it has done so.
The Board has also stated that the Bullying and Harassment Community Standard should “clearly explain to users how bullying and harassment differ from speech that only causes offense and may be protected under international human rights law” (Pro-Navalny Protests in Russia, recommendation no. 2). Meta declined to implement this recommendation following a feasibility assessment. Finally, the Board stated Meta should: “Include illustrative examples of violating and non-violating content in the Bullying and Harassment Community Standard to clarify the policy lines drawn and how these distinctions can rest on the identity status of the target” (Pro-Navalny Protests in Russia, recommendation no. 4). Meta likewise declined to implement this recommendation after a feasibility assessment.
The Board believes that full implementation of these recommendations, which call for effective assessment of context and the development of an election integrity framework, would help reduce the number of enforcement errors.
Decision
The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the case was brought to the company’s attention.