
Oversight Board upholds Meta's decision in "Tigray Communication Affairs Bureau" case (2022-006-FB-MR)


October 2022

The Oversight Board has upheld Meta’s decision to remove a post threatening violence in the conflict in Ethiopia. The content violated Meta’s Violence and Incitement Community Standard, and removing it was in line with the company’s human rights responsibilities. Overall, the Board found that Meta must do more to meet its human rights responsibilities in conflict situations and has made policy recommendations to address this.

About the case

On February 4, 2022, Meta referred a case to the Board concerning content posted on Facebook during a period of escalating violence in the conflict in Ethiopia, where Tigrayan and government forces have been fighting since November 2020.

The post appeared on the official page of the Tigray Regional State’s Communication Affairs Bureau and was viewed more than 300,000 times. It discusses the losses suffered by federal forces and encourages the national army to “turn its gun” towards the “Abiy Ahmed group.” Abiy Ahmed is Ethiopia’s Prime Minister. The post also urges government forces to surrender and says they will die if they refuse.

After being reported by users and identified by Meta’s automated systems, the content was assessed by two Amharic-speaking reviewers. They determined that the post did not violate Meta’s policies and left it on the platform.

At the time, Meta was operating an Integrity Product Operations Centre (IPOC) for Ethiopia. IPOCs are used by Meta to improve moderation in high-risk situations. They operate for a short time (days or weeks) and bring together experts to monitor Meta's platforms and address any abuse. Through the IPOC, the post was sent for expert review, found to violate Meta’s Violence and Incitement policy, and removed two days later.

Key findings

The Board agrees with Meta’s decision to remove the post from Facebook.

The conflict in Ethiopia has been marked by sectarian violence and violations of international law. In this context, and given the profile and reach of the page, there is a high risk that the post could have led to further violence.

As a result, the Board agrees that removing the post is required by Meta’s Violence and Incitement Community Standard, which prohibits “statements of intent to commit high-severity violence.” The removal also aligns with Meta’s values; given the circumstances, the values of “Safety” and “Dignity” prevail over “Voice.” The Board also finds that removal of the post aligns with Meta’s human rights responsibilities and is a justifiable restriction on freedom of expression.

Meta has long been aware that its platforms have been used to spread hate speech and fuel violence in conflict. The company has taken positive steps to improve content moderation in some conflict zones. Overall, however, the Board finds that Meta has a human rights responsibility to establish a principled, transparent system for moderating content in conflict zones to reduce the risk of its platforms being used to incite violence or violations of international law. It must do more to meet that responsibility.

For example, Meta provides insufficient information on how it implements its Violence and Incitement policy in armed conflict situations, what policy exceptions are available, or how they are used. Its current approach to content moderation in conflict zones suggests inconsistency; observers have accused the company of treating the Russia-Ukraine conflict differently from others.

While Meta says it compiles a register of “at-risk” countries, which guides its allocation of resources, it does not provide enough information for the Board to evaluate the fairness or efficacy of this process. The IPOC in this case led to the content being removed. However, it remained on the platform for two days. This suggests that the “at-risk” system and IPOCs are inadequate to deal with conflict situations. According to Meta, IPOCs are "not intended to be a sustainable, long-term solution to dealing with a years-long conflict.” The Board finds Meta may need to invest in a more sustained mechanism.

The Oversight Board’s decision

The Oversight Board upholds Meta’s decision to remove the post.

The Board also makes the following recommendations:

  • In line with the Board’s recommendation in the “Former President Trump’s Suspension” decision, as reiterated in the “Sudan Graphic Video” decision, Meta should publish information on its Crisis Policy Protocol. The Board will consider this recommendation implemented when, within six months of this decision being published, information on the Crisis Policy Protocol is available in the Transparency Center as a separate policy, in addition to the Public Policy Forum slide deck.
  • Meta should assess the feasibility of establishing a sustained internal mechanism that provides it with the expertise, capacity and coordination required to review and respond to content effectively for the duration of a conflict.

*In the interest of making this decision accessible to those most affected by the content under review, the Board chose to delay its publication until the Amharic translation was available. This contributed to the decision significantly exceeding the 90-day deadline from announcement to publication.

For further information

To read the full decision, click here.

To read a synopsis of public comments for this case, please click the attachment below.

Attachments

"Tigray Communication Affairs Bureau" public comments