Overturned
Fictional Assault on Gay Couple
A user appealed Meta’s decision to leave up a Facebook post that depicts a fictional physical assault on a gay couple holding hands, accompanied by a caption containing calls to violence.
This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors and inform the public about the impact of the Board’s work. They are approved by a Board Member panel, not the full Board. They do not involve a public comments process and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.
Case Summary
A user appealed Meta’s decision to leave up a Facebook post that depicts a fictional physical assault on a gay couple holding hands, accompanied by a caption containing calls to violence. This case highlights errors in Meta’s enforcement of its Hate Speech policy. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and removed the post.
Case Description and Background
In July 2023, a Facebook user posted a 30-second video clip, which appears to be scripted and produced with actors, showing a gay couple being beaten and kicked by a group of people. The video then shows another group of individuals, dressed in religious attire, approaching the fight. After a few seconds, this group joins in, also assaulting the couple. The video ends with a sentence in English: “Do your part this pride month.” The accompanying caption, also in English, states, “Together we can change the world.” The post was viewed approximately 200,000 times and reported fewer than 50 times.
According to Meta: “Our Hate Speech policy prohibits calls to action and statements supporting or advocating harm against people based on a protected characteristic, including sexual orientation.” The post’s video and caption endorse violence against people on the basis of a protected characteristic, which is clearly signaled by the visuals of two men holding hands and the references to Pride month. The content therefore violates Meta’s Hate Speech policy.
Meta initially left the content on Facebook. After the Board brought this case to Meta’s attention, the company determined that the content did violate its Community Standards and removed it.
Board Authority and Scope
The Board has authority to review Meta's decision following an appeal from the person who reported content that was then left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation processes involved, reduce errors and increase fairness for Facebook and Instagram users.
Case Significance
This case highlights errors in Meta’s enforcement of its Hate Speech policy. The content in this case contained multiple indicators that the user was advocating violence against a group defined by a protected characteristic: the video visually depicts a physical assault on two men holding hands, and the text at the end of the video encourages viewers to “do their part” during Pride month. Moderation errors like this one can negatively affect members of the targeted group. The Board notes that this content was reported multiple times during a month that is meant to celebrate LGBTQIA+ people and, as such, there should have been heightened awareness and more robust content-moderation processes in place.
Previously, the Board has issued a recommendation on improving the enforcement of Meta’s Hate Speech policy. Specifically, the Board recommended that “Meta should clarify the Hate Speech Community Standard and the guidance provided to reviewers, explaining that even implicit references to protected groups are prohibited by the policy when the reference would reasonably be understood” (Knin Cartoon decision, recommendation no. 1). Meta has demonstrated partial implementation of this recommendation through published information. The Board highlights this recommendation again and urges Meta to address these concerns to reduce the error rate in moderating hate speech content.
Decision
The Board overturns Meta’s original decision to leave up the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to the company’s attention. The Board also urges Meta to speed up the implementation of still-open recommendations to reduce such errors.