Announcing the Board’s Next Cases
March 15, 2022
Today, the Board is announcing three new cases for consideration: a cartoon related to Croatia, a graphic video depicting a civilian victim of violence in Sudan, and a post where the user attempts to reclaim certain Arabic words.
Case Selection
As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse, or raise important questions about Meta’s policies.
The cases we are announcing today are:
Knin cartoon (2022-001-FB-UA)
User appeal to remove content from Facebook
Submit public comment here.
In early December 2021, a public Facebook page describing itself as a news portal for Croatia posted a video and caption in Croatian. Meta translated the caption as “The Player from Čavoglave and the rats from Knin.” Čavoglave is a village in Croatia, and Knin is a city in the country. The video is an edited version of Disney’s cartoon “The Pied Piper.” It is two minutes and 10 seconds long, with a voiceover in Croatian. The video is overlaid with the word “pretjerivač,” which seems to refer to an online platform of the same name where users share videos and other types of content.
The video portrays a city overrun by rats. While the entrance to the city in the original Disney cartoon is labeled as “Hamelin,” the city in the edited video is labeled as the Croatian city of “Knin.” At the start of the video, a narrator describes how rats and humans lived in the royal city of Knin for many years. The narrator continues that the rats decided they wanted to live in a “pure rat country,” so they started harassing and persecuting the people living in the city. The narrator explains that when rats took over the city, a piper from the Croatian village of Čavoglave appeared. Initially, the rats did not take the piper seriously and continued with “the great rat aggression.” However, after the piper started to play a melody with his “magic flute,” the rats, captivated by the melody, started to sing “their favorite song” and followed the piper out of the city.
Meta told the Board that the melody in the cartoon is from a well-known folk song from the Western Balkans. The company translated the lyrics of the song sung by the rats as: “What is that thing shining on Dinara, Dujić’s cockade on his head [...] Freedom will rise from Dinara, it will be brought by Momčilo the warlord.” Meta told the Board that the lyrics are from a song dedicated to Momčilo Dujić, “a famous Serbian warlord during the Second World War.” The video then portrays the city’s people closing the gate behind the piper and the rats. The video ends with the piper herding the rats into a tractor, which then disappears. The narrator concludes that once the piper lured all the rats into the “magical tractor,” the rats “disappeared forever from these lands” and “everyone lived happily ever after.” The page sharing the content has over 50,000 followers. While on the platform, the content was viewed over 380,000 times, shared over 540 times, and received over 2,400 reactions and over 1,200 comments. The majority of the users who reacted to, commented on, or shared the content have accounts located in Croatia. The remaining users have their accounts located in Germany and Bosnia and Herzegovina.
The content was reported over 390 times. Of those reporters, 362 reported the content for hate speech. This appeal to the Board is based on the report of one of these users, whose account appears to be located in Serbia. Based on prior consistent human review decisions, Meta determined that the content was not in violation of the Facebook Community Standards and did not remove it. Meta said that it uses automation to handle subsequent reports on content it has already reviewed, to avoid duplicate review of the same content. After the user who reported the content appealed against Meta’s decision, Meta conducted an additional human review and upheld the original decision to keep the content on the platform.
The user who reported the content submitted their appeal to the Board in Serbian. They begin by stating that the flute player represents “the Croatian Army, which persecuted Serbs from Croatia.” They also state that the rats represent Serbs. According to the user who reported the content, Meta did not assess the video correctly. They add that the video “brings national and religious hatred” to the Balkans and beyond. They also state that “this portal” spreads “national intolerance between the two nations that barely healed the wounds.”
In late January 2022, as a consequence of the Board selecting the case, Meta identified its decision to keep the content on the platform as an “enforcement error” and removed it as a violation of the Hate Speech policy. Meta explained that its original at-scale decision was based on a limited amount of context available to the content reviewer.
Under its Hate Speech Community Standard, Meta takes down content targeting a person or group of people based on their race, ethnicity and/or national origin with “dehumanizing speech or imagery in the form of comparisons, generalizations, or unqualified behavioral statements (in written or visual form) to or about: [a]nimals that are culturally perceived as intellectually or physically inferior.” Meta told the Board that, in light of the historical context of the case, the content contains a direct attack against Serbians by comparing them to rats. Meta concluded that the content creates an environment of intimidation and exclusion and that it may also promote real-world harm.
In their statement to the Board, the user who posted the content stated that they are “not sure about the content” and that they are part of the page “only as business associate for advertising purposes.” Meta does not consider the user who posted the content to be a public figure.
The Board would appreciate public comments that address:
- How enforcement of Meta’s content policies on hate speech should take into account local context in Southeast Europe, specifically Croatia, and the history of conflict in this region.
- How to improve human review of complex content, in particular in the form of video.
- The nature of Meta’s compliance with its human rights responsibilities in relation to avoiding or mitigating adverse human rights impacts resulting from hate speech on its platforms in Southeast Europe, specifically Croatia, and its commitments to respect freedom of expression.
- Historical, social, political, and cultural context in Croatia, Serbia, and Bosnia and Herzegovina regarding the nature, prevalence, and impact of hate speech in these contexts, including in diaspora communities.
In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. The Board welcomes public comments proposing recommendations that are relevant to this case.
Sudan graphic video (2022-002-FB-MR)
Case referred by Meta
Submit public comment here.
On December 21, 2021, Meta referred a case to the Board concerning a graphic video depicting a civilian victim of violence in Sudan. The content was posted to the user’s Facebook profile page following a military coup in the country on October 25, 2021, and the start of protests against the military takeover of the government. The protests have been met with violence, with journalists and activists attacked and arrested by the security forces.
The video shows a person, possibly a minor, with a significant head wound lying next to a car. Voices can be heard saying in Arabic that someone has been beaten and left in the street. The post includes a caption, also in Arabic, calling on the people to stand together and not to trust the military, with numerous hashtags including #DocumentingMilitaryAbuses and #CivilDisobedience. The post was viewed fewer than 1,000 times and no users reported the content.
Meta’s automated systems identified the content as potentially violating and, following review, removed the content for violating the Violent and Graphic Content Community Standard. The user appealed Meta’s decision to remove the post. Meta reviewed the post again and applied the newsworthiness allowance to restore the post. When Meta restored the post, it placed a warning screen on the video marking it as sensitive and requiring users to click through to view the content. The warning screen prevents users under the age of 18 from viewing the video.
Under its Violent and Graphic Content policy, Meta states that it removes any content that “glorifies violence or celebrates suffering” but allows graphic content “to help people raise awareness.” The policy prohibits posting “videos of people or dead bodies in non-medical settings if they depict dismemberment.” According to its newsworthiness allowance, Meta allows violating content on its platforms “if keeping it visible is in the public interest.”
In its referral, Meta states that the decision on this content is difficult because it highlights the tension between the public interest value of documenting human rights violations and the risk of harm associated with sharing such graphic content. Meta also highlights the importance of allowing users to document human rights violations during a coup and when internet access in the country has been shut down.
The Board has not received a statement from the user responsible for the content.
The Board would appreciate public comments that address:
- Whether Meta’s policies on violent and graphic content provide sufficient protection of users documenting or raising awareness of human rights violations.
- Meta’s compliance with its human rights responsibilities around moderation of expression containing graphic and violent content, including whether the rights of all victims are equally protected and whether it sufficiently protects the rights of traumatized survivors and relatives or loved ones of depicted victims.
- Meta’s moderation of violent and graphic content during periods of crisis, mass protests, or internet shutdowns, and what factors Meta should consider when determining whether to remove or apply warning screens and age-gating to such content.
- How the use of a warning screen on graphic content, including to restrict access by minors, may impact the rights of Facebook users (e.g., the ability to raise awareness and document abuses, and the rights to privacy, physical integrity, and physical and mental health).
- How Meta’s content moderation, including the use of automation, impacts freedom of expression and documentation of human rights violations during a conflict, and how negative impacts may be prevented or mitigated.
In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.
Reclaiming Arabic words (2022-003-IG-UA)
User appeal to restore content to Instagram
Submit public comment here.
Note: To allow people to provide comments on the nature and impact of the post and help people understand the Board's eventual ruling in this case, we are sharing some of the exact words used in this post. We do so in the interest of transparency, while recognizing that some of the quoted language has the potential to offend.
In November 2021, an Instagram account that identifies itself as a space for discussing queer narratives in Arab culture posted a series of pictures in a carousel (a single Instagram post that can contain up to ten images with a single caption). The caption, written in both Arabic and English, explained that each picture shows a different word that can be used in a derogatory way towards men with “effeminate mannerisms” in the Arab world, including the terms “zamel,” “foufou,” and “tante”/“tanta.” It also stated that the user did not “condone or encourage the use of these words.” The user explained in the post that they had been abused with one of these terms when they were a child and that the post was intended “to reclaim [the] power of such hurtful terms.” The content was viewed approximately 9,000 times, receiving around 30 comments and approximately 2,000 reactions.
Within three hours of the content being posted, a user reported it as “adult nudity or sexual activity” and another user reported it as “sexual solicitation.” After reviewing each of these reports separately, Meta removed the content for violating its Hate Speech policy. The user appealed and Meta restored the content to the platform. After the content was restored, a third person reported it as hate speech and Meta carried out a fourth review, removing the content again. The user appealed a second time and, after a fifth review, Meta upheld its decision to remove the content.
In its statement to the Board, Meta explained that it originally removed the content under its Hate Speech policy as “zamel” (زامل) is regarded as a “derogatory term for gay people,” which the company had designated as a slur for its “Arabic” and “Maghreb” markets at the time the content was taken down. Following an audit of the use of the word, on February 23, 2022, Meta removed the word from the “Arabic slur list” and kept it in the “slur list for the Maghreb region.” Meta performed an additional sixth review of the content and determined that it did not violate the Hate Speech policy. Meta explained that the removal was wrong because “the use of the slur fell within Meta’s allowance for content that condemns a slur or hate speech, discusses the use of slurs including reports of instances when they have been used, or debates about whether they are acceptable to use.” All six reviews were carried out by human content moderators.
In their appeal to the Board, the user states that their intent in posting the content was to celebrate effeminate men and boys in Arab society who are often belittled through the use of derogatory language. The user further explained that they are attempting to reclaim derogatory words used against them as a form of resistance and empowerment. They stated that their content is allowed under Meta’s content policies which specifically permit the use of otherwise banned terms when used self-referentially or in an empowering way.
The Board would appreciate public comments that address:
- How the Instagram Community Guidelines and Facebook Community Standard on Hate Speech, especially the rules on slurs, can best protect LGBTQ+ people from attacks using derogatory slurs, while also allowing LGBTQ+ people to engage in counter speech that may use the same slurs.
- The policy requirement for users to “clearly state their intent” if using hate speech terms to condemn, raise awareness, or empower, and if or how other contextual factors should be considered when enforcing this exception.
- Meta’s compliance with its human rights responsibilities in respect of Arabic speaking users of its products who are LGBTQ+, especially those located in North Africa and West Asia.
- Challenges and risks to LGBTQ+ people exercising their freedom of expression rights on Meta’s products in North Africa and West Asia.
- Any improvements to Meta’s products and approach to content moderation that would enhance the protection of rights for LGBTQ+ people on Meta’s platforms, including in respect of mass-reporting (also known as “brigading”).
In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.
Public Comments
If you or your organization feel that you can contribute valuable perspectives that can help with reaching a decision on the cases announced today, you can submit your contributions using the links above. The public comment window for these cases is open for 14 days, closing at 15:00 GMT on Tuesday, March 29, 2022.
What’s Next
In the coming weeks, Board Members will be deliberating these cases. Once they have reached their final decisions, we will post them on the Oversight Board website. To receive updates when the Board announces new cases or publishes decisions, sign up here.