Overturned

Mistreatment by Ecuadorian Forces

A user appealed Meta’s decision to remove a Facebook video from Ecuador showing people being tied up, stepped on and beaten with a baton by individuals dressed in what appear to be military uniforms.

Type of Decision
Summary

Policies and Topics
Topic: Freedom of expression, Governments, Politics
Community Standard: Hate speech, Violence and incitement, Violent and graphic content

Region/Countries
Location: Ecuador

Platform
Facebook

Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

A user appealed Meta’s decision to remove a Facebook video from Ecuador showing people being tied up, stepped on and beaten with a baton by individuals dressed in what appear to be military uniforms. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post, also applying a “Mark as Sensitive” warning screen.

About the Case

In January 2024, a Facebook user posted a video showing a group of people tied up and lying face down on the ground, while several other people in camouflage clothing step on their necks and backs repeatedly, holding them in position and beating them with a baton. No person’s face is visible in the video. The audio includes someone saying “maricón,” which the Board in its Colombia Protests decision noted had been designated as a slur by Meta. The post also contains text in Spanish condemning the beating of “defenceless” and “unarmed” prisoners.

Around the time the content was posted, inmates rioted in jails in Ecuador, taking prison guards and administrative workers hostage. Ecuador’s government declared a state of emergency and imposed a curfew. The police and military then regained control of some of the prisons, with the army sharing images of hundreds of inmates, shirtless and barefoot, lying on the ground.

Meta initially removed the user’s post from Facebook under its Violence and Incitement Community Standard, which prohibits threats of violence, defined as “statements or visuals representing an intention, aspiration or call for violence against a target.”

When the Board brought this case to Meta’s attention, the company did not provide reasons for its original removal of the content under the Violence and Incitement Community Standard. The company also assessed the content under the Hate Speech Community Standard. Meta explained that although the content contained a slur, it fell within that policy’s exception for content that includes “slurs or someone else’s hate speech in order to condemn the speech or report on it.”

Meta also explained that under the Violent and Graphic Content Community Standard, the company applies a “Mark as Sensitive” warning screen to “imagery depicting one or more persons subjected to violence and/or humiliating acts by one or more uniformed personnel doing a police function.” The company restored the content to Facebook and applied a “Mark as Sensitive” label to it.

Board Authority and Scope

The Board has authority to review Meta’s decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.

Significance of Case

This case illustrates the challenges Meta faces in enforcing its policies on Violence and Incitement, Violent and Graphic Content, and Hate Speech. These challenges are particularly acute when content documents violence or abuses during a crisis in order to raise awareness.

For the Violence and Incitement Community Standard, the Board recommended that Meta “add to the public-facing language ... that the company interprets the policy to allow content containing statements with ‘neutral reference to a potential outcome of an action or an advisory warning,’ and content that ‘condemns or raises awareness of violent threats’” (Russian Poem, recommendation no. 1). This recommendation has been implemented: as of February 2024, Meta has updated the policy to clarify that it does not prohibit threats when they are shared in an awareness-raising or condemning context.

In terms of the Violent and Graphic Content Community Standard, the Board recommended that Meta “notify Instagram users when a warning screen is applied to their content and provide the specific policy rationale for doing so” (Video After Nigeria Church Attack, recommendation no. 2). Meta has reported progress on implementing this recommendation. In its Q4 2023 update on the Board, Meta stated: “Individuals using our platforms can anticipate receiving more comprehensive details about enforcement determinations and safety measures taken regarding their content, including the implementation of warning screens. Given that this is an integral component of our broader compliance initiative, we anticipate delivering a more comprehensive update later in 2024.”

Regarding Hate Speech, the Board recommended that Meta “revise the Hate Speech Community Standard to explicitly protect journalistic reporting on slurs, when such reporting, in particular in electoral contexts, does not create an atmosphere of exclusion and/or intimidation. This exception should be made public, and be separate from the ‘raising awareness’ and ‘condemning’ exceptions” (Political Dispute Ahead of Turkish Elections, recommendation no. 1). Meta has reported progress in implementing this recommendation. The Board also recommended that Meta “develop and publicize clear criteria for content reviewers for escalating for additional review public interest content that potentially violates the Community Standards” (Colombia Protests, recommendation no. 3). Meta described this as work it already does but did not publish information demonstrating implementation.

The Board believes that full implementation of these recommendations could contribute to decreasing the number of enforcement errors across the policies on Violence and Incitement, Violent and Graphic Content, and Hate Speech.

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.
