Announcing the Board’s Next Cases
September 16, 2021
Today, the Board is announcing three new cases for consideration.
Case Selection
As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse, or raise important questions about Facebook's policies.
The cases we are announcing today are:
2021-012-FB-UA
User appeal to restore content to Facebook
Submit public comment here.
In August 2021, a Facebook user posted a picture of Indigenous artwork, with accompanying text description in English. The picture shows a traditional wampum belt, made with shells or beads. The belt includes a series of depictions which the user says were inspired by “the Kamloops story,” a reference to the discovery of unmarked graves at a former residential school for First Nations children in British Columbia, Canada.
The text gives the artwork the title “Kill the Indian/ Save the Man,” identifies the user as its creator, and provides the phrases "Theft of the Innocent," "Evil posing as Saviors," "Residential School/Concentration Camp," "Waiting for Discovery" and "Bring Our Children Home." Each of these phrases appears to correspond to a distinct section of the series of depictions on the wampum belt.
The user states that wampum belts have “always been a means of documenting our history,” and that before colonization storytellers would travel from village to village “teaching our people our history.” The user states that their belts today serve much the same purpose, but instead travel via social media and are displayed in exhibits. They say that the belt was not easy to create, that bringing the story to life was very emotional, but that it was too important not to document. They further say that the belt will be one of a three-belt set and will not be for sale, as this is a story that cannot be hidden from public knowledge again.
The user says that they made the belt after news of the Kamloops story. In the post, the user apologizes for any pain their belts cause to survivors, stating that this is not the intent – the “sole purpose is to bring awareness to this horrific story.”
Facebook removed the content under its Hate Speech Community Standard. As a result of the Board selecting this case, Facebook identified its removal as an “enforcement error” and restored the content, which remains available on the platform. At the time of removal, the content had been viewed over 4,000 times and shared over 50 times. No users reported the content.
Under its Hate Speech policy, Facebook takes down content that targets people with “violent speech” on the basis of a protected characteristic, including race, ethnicity and national origin. Indigenous origin or identity is not expressly listed as a protected characteristic. The policy includes the following exceptions: “We recognize that people sometimes share content that includes someone else’s hate speech to condemn it or raise awareness. In other cases, speech that might otherwise violate our standards can be used self-referentially or in an empowering way. Our policies are designed to allow room for these types of speech, but we require people to clearly indicate their intent. If the intention is unclear, we may remove content.”
In their appeal, the user states that they are a traditional artist sharing their artwork which is important to documenting history. They state that this is censorship and that it is important that people see what they posted.
The Board would appreciate public comments that address:
- Whether Facebook’s initial decision to remove the post was consistent with the company's Hate Speech Community Standard, the company's stated values and human rights responsibilities and commitments.
- Concerns related to Facebook's moderation of artistic expression, particularly art that may address sensitive themes.
- The history and use of the phrase “Kill the Indian/ Save the Man” in North America.
- Contextual information on human rights abuses against children of Indigenous origin or identity in residential schools in Canada, including First Nations children at the Kamloops Indian Residential School.
- Whether Indigenous origin or identity should be a protected characteristic in Facebook’s hate speech policy.
- How Facebook’s content moderation, including the use of automation, impacts the freedom of expression of Indigenous peoples, and how negative impacts may be prevented or mitigated.
In its decisions, the Board can issue policy recommendations to Facebook. While recommendations are not binding, Facebook must respond to them within 30 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.
2021-013-IG-UA
User appeal to restore content to Instagram
Submit public comment here.
In July 2021, an Instagram account for a spiritual school based in Brazil posted a picture of a dark brown liquid in a jar and two bottles, described as ayahuasca in the accompanying text in Portuguese. Ayahuasca is a plant-based brew with psychoactive properties that has spiritual and ceremonial uses in some South American countries.
The text opens with the statement “AYAHUASCA IS FOR THOSE WHO HAVE THE COURAGE TO FACE THEMSELVES,” followed by further description of ayahuasca. The text includes statements that ayahuasca is for those who want to “correct themselves,” “enlighten,” “overcome fears,” and “break free.” It further states ayahuasca is a “remedy” and “can help you” if one has humility and respect. It states that ayahuasca shows the truth but does not work miracles. It ends with “Ayahuasca, Ayahuasca!/ Gratitude, Queen of the Jungle!”
The content was viewed over 15,500 times and no user reported it. Facebook removed the content for violating the Instagram Community Guidelines, which state: “Remember to always follow the law when offering to sell or buy other regulated goods” and link to Facebook’s Community Standard on Regulated Goods. The Regulated Goods policy prohibits content related to “non-medical drugs” which “admits to personal use without acknowledgment of or reference to recovery, treatment, or other assistance to combat usage” or “coordinates or promotes (by which we mean speaks positively about, encourages the use of, or provides instructions to use or make) non-medical drugs.”
The user states in their appeal that they are certain the post does not violate Instagram’s Community Guidelines, as their page is informative and never encouraged or recommended the purchase or sale of any product prohibited by the Community Guidelines. They say that they took the photo at one of their ceremonies, which are regulated and legal. According to the user, the account aims to demystify the sacred ayahuasca drink. They say that there is a great lack of knowledge about ayahuasca. The user states that it brings spiritual comfort to people and their ceremonies can improve societal wellbeing. They further state that they have posted the same content previously on their account and that post remains online.
The Board would appreciate public comments that address:
- Whether Facebook’s decision to remove the post is consistent with the Instagram Community Guidelines, specifically the reminder to “follow the law” regarding the sale or purchase of regulated products, and Facebook’s Community Standard on Regulated Goods, specifically the rules on speaking positively about, encouraging, or promoting non-medical drugs.
- Whether Facebook's policies on the regulation of non-medical drugs should take into account different legal approaches at the national level, or provide a different rule for positive discussion of non-medical drugs in the context of a religious or spiritual practice.
- The clarity of the relationship between Instagram’s Community Guidelines and Facebook’s Community Standards, including in relation to regulated goods.
- Whether Facebook’s decision to remove the post is consistent with the company's stated values and human rights responsibilities and commitments, including in relation to freedom of expression and freedom of religion or belief.
- Information on the use and significance of ayahuasca, including in ceremonial or religious contexts by different groups in South America.
- Information on how ayahuasca use may affect physical and mental health, and/or people’s safety.
In its decisions, the Board can issue policy recommendations to Facebook. While recommendations are not binding, Facebook must respond to them within 30 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.
2021-014-FB-UA
User appeal to restore content to Facebook
Submit public comment here.
In late July 2021, a Facebook user posted in Amharic on their timeline claiming that the Tigray People’s Liberation Front (TPLF) killed and raped women and children, as well as looted the properties of civilians in Raya Kobo and other towns in Ethiopia’s Amhara Region. The user also claimed that ethnic Tigrayan civilians assisted the TPLF in these crimes.
The user begins by claiming that “people in the area” have reported that Raya Kobo was empty and had been looted prior to the “rebel group” TPLF entering it. They then allege that in Raya Kobo, Kobo City, Aradum, Menjelo, Robit and Gobye, children and married women were raped, and youngsters were shot dead. The user alleges that farmers were killed “by the terrorist group” in retaliation for saying “I don’t like the color of your eyes.” The user also claims that “mothers in the area” have no food, are atrociously treated and forced to feed the TPLF. The user also states that “the Tigray rebel group” announced that the Raya Kobo community members would be given an ID card indicating “Tigre.”

The user adds that the TPLF “appealed to everyone they meet on the phone” informing them that the government has “sold” “the part of Amhara from Alawha and beyond.” The user claims that the TPLF brought and publicly executed “innocent youths” from Alamata, Korem and other parts of Raya Kobo. According to the user, they did this to appear credible to the Raya Kobo community. The user states the TPLF lied and deceived the community by claiming that the executed youths were robbers who had followed them from Tigray and who did not speak Amharic. Further, the user claims that in “every town where the crowds live” the TPLF locates itself in every health center and school to “annihilate the people” in the event of an air strike. Lastly, the user states that they are receiving reports from people living in the area that “the Tigreans, who know the area very well” led the TPLF door-to-door to expose women to rape and to loot property. The user ends the post with the words “We will secure our freedom through our struggle.”
The user does not claim in the post to be an eyewitness to the events described. The post also does not attribute any of the claims or allegations it contains to named individuals, institutions, or media. It does not contain hyperlinks to external sources or include any images.
The post was viewed more than 6,500 times, receiving fewer than 35 comments and more than 140 reactions. It was shared over 30 times. According to Facebook, the user’s account that posted the content is located in Ethiopia, but not in the Tigray or Amhara regions. The user’s profile picture includes a hashtag signaling disapproval of the TPLF. The post remained on the platform for approximately one day. The post was automatically reported. Facebook then removed the post under its Hate Speech Community Standard. The Board does not know at this stage whether the post was reviewed by a human moderator or automatically removed. As a result of the Board selecting this case, Facebook identified the post’s removal as an “enforcement error” and restored it.
Under Tier 1 of the Hate Speech policy, Facebook users are prohibited from posting content that targets a group of people with generalizations or unqualified behavioral statements describing them as violent or sexual criminals, or other criminals. The Hate Speech policy rationale states that Facebook recognizes “that people sometimes share content that includes someone else’s hate speech to condemn it or raise awareness” and that “policies are designed to allow room for these types of speech, but we require people to clearly indicate their intent. If the intention is unclear, we may remove content.”
The policy rationale for Hate Speech was updated in June 2021 to clarify that the company “define[s] hate speech as a direct attack against people – rather than concepts or institutions – on the basis of what we call protected characteristics [...]" [emphasis added]. In the June 2021 policy update, Facebook provided a new rule under the title “For the following Community Standards, we require additional information and/or context to enforce.” The new rule prohibits content “attacking concepts, institutions, ideas, practices, or beliefs associated with protected characteristics, which are likely to contribute to imminent physical harm, intimidation or discrimination against the people associated with that protected characteristic. Facebook looks at a range of signs to determine whether there is a threat of harm in the content. These include but are not limited to: content that could incite imminent violence or intimidation; whether there is a period of heightened tension such as an election or ongoing conflict; and whether there is a recent history of violence against the targeted protected group. In some cases, we may also consider whether the speaker is a public figure or occupies a position of authority.”
Though the content in this case was in Amharic, the user submitted their appeal to the Board in English. In their statement to the Board, the user says that the post is intended to protect their community, which is in danger, and that Facebook must help communities in war zones. They state the post is not hate speech “but is truth.” They state that the TPLF targeted their community of one million people and left them without food, water and other basic necessities. The user also speculates that their post was reported “by members and supporters of that terrorist group,” and claims to “know well most of the rules” and that they have “never broken any rules of Facebook.”
The Board would appreciate public comments that address:
- Whether Facebook’s initial decision to remove the post is consistent with the company’s Hate Speech Community Standard, the company's stated values and human rights responsibilities and commitments.
- Whether Facebook’s Hate Speech policy adequately enables users to raise awareness of alleged human rights violations, including in conflict zones.
- Content moderation challenges specific to Ethiopia and languages spoken in the country, both in terms of respecting freedom of expression and addressing harms that may result from hate speech, in particular during times of heightened tension or conflict.
- Recent trends relating to the use or spread of hate speech or misinformation by parties to the conflict in Ethiopian languages, as well as by civilians engaged in discussions around it, and the role of Facebook. It would be useful for comments to include socio-political and historical context for these trends, in particular on relations between ethnic groups present in the locations named in the post.
- The current role and composition of the TPLF in Ethiopia, including any distinction between its political and paramilitary functions, as well as the conduct of the TPLF or other groups during the conflict described in the post, in particular any alleged human rights violations or abuses in the places named in this content.
- Challenges researchers, journalists and investigators may face in documenting atrocity crimes through Facebook, including in collecting evidence for accountability, and any responsibility Facebook has to preserve content and ensure access to it.
In its decisions, the Board can issue policy recommendations to Facebook. While recommendations are not binding, Facebook must respond to them within 30 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.
Public Comments
If you or your organization feel that you can contribute valuable perspectives that can help with reaching a decision on the cases announced today, you can submit your contributions using the links above. The public comment window for these cases is open for 14 days, closing at 15:00 UTC on Thursday 30 September.
What’s Next
In the coming weeks, Board Members will be deliberating these cases. Once they have reached their final decisions, we will post them on the Oversight Board website. To receive updates when the Board announces new cases or publishes decisions, sign up here.