
Oversight Board announces new cases related to Colombia, Afghanistan, and Ethiopia


May 2022

Today, the Board is announcing three new cases for consideration. As part of this, we are inviting people and organizations to submit public comments.

Case selection

As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse, or raise important questions about Meta’s policies.

The cases we are announcing today are:

Colombian police cartoon (2022-004-FB-UA)

User appeal to restore content to Facebook

Submit public comment here.

In September 2020, a Facebook user in Colombia posted a picture of a cartoon as a comment on another user’s post. The cartoon resembles the official crest of the National Police of Colombia and depicts three figures wearing police uniforms and holding batons over their heads. The figures appear to be kicking and beating another figure who is lying on the ground with blood beneath their head. A book and a pencil are shown next to the figure on the ground. The text on the crest reads in Spanish, “República de Colombia - Policía Nacional - Bolillo y Pata,” which Meta’s regional markets team translated to “National Police – Republic of Colombia – Baton and Kick.” At the time the content was posted, there had recently been protests in Colombia against police violence.

According to Meta, in January 2022, 16 months after the content was originally posted to Facebook, the company removed the content because it matched a picture in its “media matching bank” of content violating Facebook’s Dangerous Individuals and Organizations Community Standard. Meta’s mention of a “media matching bank” appears to refer to a system that helps Meta find duplicates of harmful media content and prevent them from being shared. The Board is currently seeking more information from Meta about how the “media matching bank” is created, controlled, and used.

The user appealed this decision, and Meta upheld the removal, but on the basis of a different Facebook Community Standard: the Violence and Incitement Community Standard. At the time of removal, the content had received three views and no reactions. No users reported the content.

In their statement to the Board, the user primarily expresses confusion about how their content violated Meta’s policies. They describe the content as reflecting reality in Colombia.

As a result of the Board selecting this case, Meta identified the removal of the content as an “enforcement error” and restored it. Meta explained to the Board that the removal decisions were wrong because the content did not contain a reference to any dangerous individual or organization, nor did it contain a threat of violence or a statement of intent to commit violence. Meta also confirmed that, while it found the Violent and Graphic Content Community Standard relevant, it did not consider the content to violate that standard, as “fictional imagery” is not prohibited.

The Board would appreciate public comments that address:

  • Whether Meta’s policies on Dangerous Individuals and Organizations, Violence and Incitement, and Violent and Graphic Content sufficiently respect expressions of political dissent, including against state/police violence.
  • How Meta’s use of its “media matching bank” and automation could be improved to avoid the removal of non-violating content and enhance detection of violating content.
  • Insights on the socio-political context in Colombia, particularly regarding the restriction of information and discussion on social media of protests and criticism of police violence.
  • Insights into the role of social media globally in criticizing or documenting instances of police violence.

In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

Mention of the Taliban in news reporting (2022-005-FB-UA)

User appeal to restore content to Facebook

Submit public comment here.

In January 2022, the Facebook page of a news outlet in India shared a post, in Urdu, containing text and a link to an article on the news outlet’s website. The text states that Zabiullah Mujahid (the Culture and Information Minister and official central spokesman of the Taliban government in Afghanistan) said that the Afghan New Year begins on March 21 and that schools and colleges for girls and women will open this year from the beginning of the new year. The article discusses this announcement in further detail. The page has about 14,000 followers.

A user reported the content to Meta but did not complete the report. The incomplete report triggered a classifier, which scored the content as potentially violating the Dangerous Individuals and Organizations (“DIO”) policy and sent it for human review.

Meta removed this content for violating its Dangerous Individuals and Organizations Community Standard, having determined that it violated the Standard’s prohibition on praising a designated terrorist group. Zabiullah Mujahid is a prominent member of and spokesman for the Taliban, which is a Tier 1 designated terrorist organization under Meta’s DIO policy.

According to the Facebook Community Standards, praise of a designated entity includes “[s]peak[ing] positively about a designated entity or event;” “[g]iv[ing] a designated entity or event a sense of achievement;” “[l]egitimizing the cause of a designated entity by making claims that their hateful, violent, or criminal conduct is legally, morally, or otherwise justified or acceptable;” and “[a]ligning oneself ideologically with a designated entity or event.” Meta states that it allows content which references dangerous individuals and organizations in the context of reporting on them, but “users must clearly indicate their intent when creating or sharing such content” and “if a user’s intention is ambiguous or unclear,” it will default to removing content.

The user who created the content appealed the removal, but Meta upheld its decision. The user then appealed to the Oversight Board. When the Board brought the content to Meta’s attention, Meta determined that the removal was an enforcement error: the content fell within the DIO policy’s exception for reporting and should not have been removed. Meta stated that it had no information on why the content was twice assessed as praise rather than as news reporting.

In their statement to the Board, the user states that they are a media organization and do not support extremism. They say that their articles are based on national and international media sources and that this content was shared to provide information about women’s and girls’ education in Afghanistan.

The Board would appreciate public comments that address:

  • How Meta’s content moderation policies and practices affect public discourse about the Taliban’s role in Afghanistan.
  • How Meta’s content policies and practices on Dangerous Individuals and Organizations affect the ability of journalists to report on these groups.
  • The DIO policy prohibition on “praise” of Tier 1 and 2 designated individuals and entities and its compatibility with Meta’s human rights responsibilities.
  • What principles should guide whether and when Meta revokes or changes the designation of an entity under the Dangerous Individuals and Organizations Community Standard, including for entities that form or take the place of governments.
  • Whether Facebook's Dangerous Individuals and Organizations Community Standard unnecessarily limits discussion of designated groups that either form or take the place of governments, including in relation to “false positive” removals of media reporting and other commentary on current affairs.
  • The relationship between US law prohibiting material support of designated terrorist organizations and Meta's content policies, and how this may affect freedom of expression globally.

In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. The Board welcomes public comments proposing recommendations that are relevant to this case.

Tigray Communication Affairs Bureau (2022-006-FB-MR)

Case referred by Meta

Submit public comment here.

On February 4, 2022, Meta referred a case to the Board concerning content posted on Facebook in November 2021, during ongoing armed conflict between the Tigray People’s Liberation Front (TPLF) and the Federal Democratic Republic of Ethiopia.

The content was posted in Amharic by the Tigray Communication Affairs Bureau page, which states that it is the official page of the Tigray Regional State (Ethiopia) Communication Affairs Bureau. The post discusses the losses suffered by the Federal National Defense Forces under the leadership of Prime Minister Abiy Ahmed. It goes on to say that the armed forces must surrender to the TPLF if they hope to save their lives, and that if they refuse, they will die. The post also encourages the national army to “turn its gun” against the Prime Minister’s group in order to make amends with the people it has harmed (for more on the conflict, see the Board’s prior decision in Case 2021-014-FB-UA). The page is set to public, meaning it can be viewed by any Facebook user. It is verified and was previously subject to cross-check, though not at the time the content was posted and reviewed. Cross-check is a system that Meta claims helps ensure accurate enforcement through additional levels of human review (for more on cross-check, see the Board’s announcement of the policy advisory opinion currently in progress). The page has about 260,000 followers.

The content was reported by 10 users for violating the Violence and Incitement, Dangerous Individuals and Organizations, and Hate Speech policies. In addition to the user reports, Meta’s automated systems identified the content as potentially violating and queued it for review. During Meta’s initial review, the company determined that the content was not violating and left it on the platform. Following another review initiated through the company’s crisis response system, Meta determined the content violated its Community Standard on Violence and Incitement and removed it.

Under its Violence and Incitement policy, Meta states that it will remove any content that “incites or facilitates serious violence.” The policy prohibits “threats that could lead to death (and other forms of high-severity violence) … targeting people or places.” The policy also states that for “coded statements” or “veiled or implicit” threats, the company will look to other signals to determine whether there is a threat of harm. These signals include, among others, whether the content was “shared in a retaliatory context” or “[references] historical or fictional incidents of violence.”

In its referral of the case to the Board, Meta states that the decision was difficult because it involved removing “official government speech that could be considered newsworthy,” while noting that the content may pose a risk of inciting violence during an ongoing conflict. Meta also told the Board that its analysis took into account the documented atrocities committed during this conflict by all parties involved.

Following the referral of this case to the Board, the user was given the option to submit a statement to the Board. The Board has not received a statement from the user.

The Board would appreciate public comments that address:

  • How Meta enforces its Violence and Incitement policy in conflict situations, including whether its actions are consistent across different conflicts.
  • Whether credible threats of violence made between parties during an armed conflict should be treated differently under Meta’s policies and under what circumstances.
  • When content that violates Meta’s policies should be allowed under the “newsworthiness” allowance in conflict situations, noting that the allowance was not applied in this case because, according to the company, it does not apply to content that presents a risk of contributing to physical harm.
  • Whether Meta should allow content that violates its Violence and Incitement Community Standard if the actions threatened, incited, or instigated are permitted under international humanitarian law (also known as the law of armed conflict).
  • Whether and how Meta’s cross-check program should work during an armed conflict.
  • Content moderation challenges specific to Ethiopia and languages spoken in the country, particularly during times of heightened tension or conflict.
  • Evidence or analysis of statements from armed groups or the military in Ethiopia on social media that have incited or instigated violence, including any violations or abuses of international law.
  • The information environment in Ethiopia during the conflict, including access to the internet and independent sources of reporting, and how this should influence Meta's approach to moderating content from parties to the conflict.

In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

Public comments

If you or your organization can contribute valuable perspectives that would help the Board reach a decision on the cases announced today, you can submit your contributions using the links above. The public comment window for these cases is open for 14 days, closing at 15:00 UTC on Tuesday, May 24, 2022.

What’s next

In the coming weeks, Board Members will be deliberating these cases. Once they have reached their final decisions, we will post them on the Oversight Board website. To receive updates when the Board announces new cases or publishes decisions, sign up here.
