Overturned

Cartoon About Rape

A user appealed Meta’s decision to leave up a Facebook post containing a cartoon that depicts one person drugging another, with the implication of impending rape.

Type of Decision

Summary

Policies and Topics

Topic
Safety, Sex and gender equality, Violence
Community Standard
Adult Sexual Exploitation

Region/Countries

Location
Mexico

Platform

Facebook

Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

A user appealed Meta’s decision to leave up a Facebook post containing a cartoon that depicts one person drugging another, with the implication of impending rape. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and removed the post.

About the Case

In April 2024, a user in Mexico reshared a Facebook post that contained a cartoon about rape. The cartoon depicts two people, who appear to be men, entering a home. The resident apologizes that the home is a mess, but when the two enter, it is perfectly clean. The other individual states that they only clean when they intend to engage in sexual intercourse. The resident replies, “me too, my friend,” while covering the other individual’s face with a cloth as they struggle. The caption accompanying the post states, “I’m sorry my friend,” followed by a sad emoji.

The user who reported this post explains that jokes about rape are “not funny” and that “men are less likely to report they have been raped and it’s because of these kinds of images.”

Meta’s Adult Sexual Exploitation policy explicitly prohibits “content depicting, advocating for or mocking non-consensual sexual touching,” including “[c]ontent mocking survivors or the concept of non-consensual sexual touching.” Meta determines lack of consent from context, including verbal expressions, physical gestures, or incapacitation.

After the Board brought this case to Meta’s attention, the company determined that the content violated the Adult Sexual Exploitation policy and that its original decision to leave up the content was incorrect. The company then removed the content from Facebook.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the user who reported content that was then left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

Where Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.

Significance of Case

This case illustrates shortcomings in the enforcement of Meta’s Adult Sexual Exploitation policy. Over many years, civil society groups have repeatedly raised concerns about under-enforcement of Meta’s policies as applied to material that jokes about rape or mocks victims and survivors of sexual violence. The Board has previously addressed the difficulties inherent in accurately moderating jokes and attempts at humor. In the Two Buttons Meme decision, the Board stressed the importance of carefully evaluating the content and context of apparent jokes when Meta assesses posts. While the Two Buttons Meme decision dealt with satirical content that was wrongly removed, this case shows the mistakes made when posts framed as jokes are not taken sufficiently seriously and are wrongly left up on the platform. As Meta now agrees, this post, which relies on a homophobic premise and mocks violent sexual assault, clearly violates the Adult Sexual Exploitation policy. Given the high likelihood of mistakes in these types of cases, the Board has recommended that Meta ensure that its processes include sufficient opportunities for “investigation or escalation where a content moderator is not sure if a meme is satirical or not” (Two Buttons Meme decision, recommendation no. 3). Meta reported implementation of this recommendation but has not published information to demonstrate it.

The Board has issued recommendations aimed at reducing the number of enforcement errors made by Meta. The Board urged Meta to “implement an internal audit procedure to continuously analyze a statistically representative sample of automated content removal decisions to reverse and learn from enforcement mistakes” (Breast Cancer Symptoms and Nudity decision, recommendation no. 5). Meta reframed the recommendation in its response and implementation, and did not address the goal of the Board’s recommendation. The Board has also repeatedly stressed the importance of Meta devoting extra resources to improve its ability to accurately assess potentially harmful content that either criticizes or legitimizes systemic problems, including gendered and sexual violence, in cases such as those addressed in the India Sexual Harassment and Image of Gender-Based Violence decisions.

Decision

The Board overturns Meta’s original decision to leave up the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.
