
Oversight Board upholds Meta's original decision: Case 2021-014-FB-UA


December 2021

On October 28, 2021, Facebook announced that it was changing its company name to Meta. In this text, Meta refers to the company, and Facebook continues to refer to the product and policies attached to the specific app.

The Oversight Board has upheld Meta’s original decision to remove a post alleging the involvement of ethnic Tigrayan civilians in atrocities in Ethiopia’s Amhara region. However, as Meta restored the post after the user’s appeal to the Board, the company must once again remove the content from the platform.

About the case

In late July 2021, a Facebook user from Ethiopia posted in Amharic. The post alleged that the Tigray People’s Liberation Front (TPLF) killed and raped women and children, and looted the property of civilians in Raya Kobo and other towns in Ethiopia’s Amhara region. The user also claimed that ethnic Tigrayan civilians assisted the TPLF in these atrocities, stating that he had received this information from residents of Raya Kobo. The user ended the post with the words: “we will ensure our freedom through our struggle.”

After Meta’s automated Amharic-language systems flagged the post, a content moderator determined that it violated Facebook’s Hate Speech Community Standard and removed it. When the user appealed this decision to Meta, a second content moderator confirmed that the post violated Facebook’s Community Standards. Both moderators belonged to Meta’s Amharic content review team.

The user then submitted an appeal to the Oversight Board. After the Board selected this case, Meta identified its original decision to remove the post as incorrect and restored it on August 27. Meta told the Board that it usually notifies users that their content has been restored on the day of restoration. In this case, however, due to human error, Meta did not inform the user that their post had been restored until September 30, over a month later, and only after the Board asked Meta whether it had notified the user.

Key findings

The Board finds that the content violated Facebook’s Community Standard on Violence and Incitement.

While Meta initially removed the post for violating the Hate Speech Community Standard, the company restored the content after the Board selected the case, claiming that the post did not target the Tigrayan ethnic group and that the user’s allegations did not constitute hate speech. The Board finds this explanation for restoring the content insufficiently detailed and incorrect.

Instead, the Board applied Facebook’s Violence and Incitement Community Standard, which prohibits “misinformation and unverifiable rumors that contribute to the risk of imminent violence or physical harm.” The Board finds that the content in this case contains an unverifiable rumor under Meta’s definition of the term. While the user claims that his sources are earlier unnamed reports and people on the ground, he does not provide even circumstantial evidence to support his allegations. Rumors alleging that an ethnic group is complicit in mass atrocities, as found in this post, are dangerous and significantly increase the risk of imminent violence.

The Board also finds that removing the post is consistent with Meta’s human rights responsibilities as a business. Unverifiable rumors in a heated and ongoing conflict could lead to grave atrocities, as was the case in Myanmar. In decision 2020-003-FB-UA, the Board stated that “in situations of armed conflict in particular, the risk of hateful, dehumanizing expressions accumulating and spreading on a platform, leading to offline action impacting the right to security of person and potentially life, is especially pronounced.” Cumulative impact can amount to causation through a “gradual build-up of effect,” as happened in the Rwandan genocide.

The Board came to its decision aware of the tension between protecting freedom of expression and reducing the threat of sectarian conflict. The Board is aware of civilian involvement in the atrocities in various parts of Ethiopia, though not in Raya Kobo, and of the fact that Meta could not verify the post’s allegations at the time it was posted. The Board is also aware that true reports on atrocities can save lives in conflict zones, while unsubstantiated claims about civilian perpetrators are likely to heighten the risk of near-term violence.

The Oversight Board’s decision

The Oversight Board upholds Meta’s original decision to remove the post. As Meta restored the content after the user’s appeal to the Board, the company must once again remove the content from the platform.

In a policy advisory statement, the Board recommends that Meta:

  • Rewrite its value of “Safety” to reflect that online speech may pose a risk to the physical security of persons and the right to life, in addition to the risks of intimidation, exclusion and silencing.
  • Reflect in the Facebook Community Standards that, in contexts of war and violent conflict, unverified rumors pose a higher risk to the rights to life and security of persons. This should be reflected at all levels of the moderation process.
  • Commission an independent human rights due diligence assessment of how Facebook and Instagram have been used to spread hate speech and unverified rumors that heighten the risk of violence in Ethiopia. The assessment should review the success of the measures Meta took to prevent the misuse of its products and services in Ethiopia, the success of the measures Meta took to allow corroborated and public interest reporting on human rights atrocities in Ethiopia, and whether Meta’s language capabilities in Ethiopia are adequate to protect the rights of its users. The assessment should cover the period from June 1, 2020, to the present, be completed within six months of Meta’s response to these recommendations, and be published in full.

For further information:

To read the full decision, click here.

To read a synopsis of public comments for this case, please click the attachment below.

Attachments

Public Comments 2021-014-FB-UA