Oversight Board overturns Meta's original decision: Case 2021-012-FB-UA
On October 28, 2021, Facebook announced that it was changing its company name to Meta. In this text, Meta refers to the company, and Facebook continues to refer to the product and policies attached to the specific app.
The Oversight Board has overturned Meta’s original decision to remove a Facebook post by an Indigenous North American artist under Facebook’s Hate Speech Community Standard. The Board found that the content is covered by allowances in the Hate Speech policy, as it is intended to raise awareness of historic crimes against Indigenous people in North America.
About the case
In August 2021, a Facebook user posted a picture of a wampum belt, along with an accompanying text description in English. A wampum belt is a North American Indigenous art form in which shells are woven together to form images, recording stories and agreements. This belt includes a series of depictions which the user says were inspired by “the Kamloops story,” a reference to the May 2021 discovery of unmarked graves at a former residential school for Indigenous children in British Columbia, Canada.
The text provides the artwork’s title, “Kill the Indian/ Save the Man,” and identifies the user as its creator. The user describes the series of images depicted on the belt: “Theft of the Innocent, Evil Posing as Saviours, Residential School / Concentration Camp, Waiting for Discovery, Bring Our Children Home.” In the post, the user describes the meaning of their artwork as well as the history of wampum belts and their purpose as a means of education. The user states that the belt was not easy to create and that it was emotional to tell the story of what happened at Kamloops. They apologize for any pain the art causes survivors of Kamloops, noting their “sole purpose is to bring awareness to this horrific story.”
Meta’s automated systems identified the content as potentially violating Facebook’s Hate Speech Community Standard the day after it was posted. A human reviewer assessed the content as violating and removed it that same day. The user appealed that decision to Meta, prompting a second human review, which also assessed the content as violating. At the time of removal, the content had been viewed over 4,000 times and shared over 50 times. No users reported the content.
As a result of the Board selecting this case, Meta identified its removal as an “enforcement error” and restored the content on August 27. However, Meta did not notify the user of the restoration until September 30, two days after the Board asked Meta for the contents of its messaging to the user. Meta explained that the late notification was the result of human error.
Meta agrees that its original decision to remove this content violated Facebook’s Community Standards and was an “enforcement error.” The Board finds this content is a clear example of “counter speech,” in which hate speech is referenced or reclaimed to resist oppression and discrimination.
The introduction to Facebook’s Hate Speech policy explains that counter speech is permitted where the user’s intent is clearly indicated. It is apparent from the content of the post that it is not hate speech. The artwork tells the story of what happened at Kamloops, and the accompanying narrative explains its significance. While the words “Kill the Indian” could, in isolation, constitute hate speech, in context this phrase draws attention to and condemns specific acts of hatred and discrimination.
The Board recalls its decision 2020-005-FB-UA, a case involving a quote from a Nazi official. That case offers similar lessons on how intent can be assessed through indicators other than direct statements, such as the content and meaning of a quote, the timing and country of the post, and the substance of reactions and comments on the post.
In this case, the Board found that it was not necessary for the user to expressly state that they were raising awareness for the post to be recognized as counter speech. The Board noted that Meta’s internal “Known Questions” guidance informs moderators that a clear statement of intent will not always be sufficient to change the meaning of a post that constitutes hate speech. Moderators are expected to make inferences from content to assess intent, and not to rely solely on explicit statements.
Two separate moderators nonetheless concluded that this post constituted hate speech. Meta was not able to explain why this error occurred twice.
The Oversight Board’s decision
The Oversight Board overturns Meta's original decision to take down the content.
In a policy advisory statement, the Board recommends that Meta:
- Provide users with timely and accurate notice of any company action being taken on the content their appeal relates to. Where applicable, including in enforcement error cases like this one, the notice to the user should acknowledge that the action was a result of the Oversight Board’s review process.
- Study the impacts on reviewer accuracy when content moderators are informed they are engaged in secondary review, so they know the initial determination was contested.
- Conduct a reviewer accuracy assessment focused on Hate Speech policy allowances that cover artistic expression and expression about human rights violations (e.g., condemnation, awareness raising, self-referential use, empowering use). This assessment should also specifically investigate how the location of a reviewer impacts the ability of moderators to accurately assess hate speech and counter speech from the same or different regions. Meta should share the results of this assessment with the Board, including how results will inform improvements to enforcement operations and policy development and whether it plans to run regular reviewer accuracy assessments on these allowances. The Board also calls on Meta to publicly share summaries of the results of these assessments in its quarterly transparency updates on the Board to demonstrate it has complied with this recommendation.
For further information:
To read the full case decision, click here.
To read a synopsis of public comments for this case, please click the attachment below.