Upheld
Armenians in Azerbaijan
This decision is also available in Armenian, Azerbaijani and Russian.
Case Summary
The Oversight Board has upheld Facebook’s decision to remove a post containing a demeaning slur which violated Facebook’s Community Standard on Hate Speech.
About the case
In November 2020, a user posted content which included historical photos described as showing churches in Baku, Azerbaijan. The accompanying text, in Russian, claimed that Armenians built Baku and that this heritage, including the churches, had been destroyed. The user described Azerbaijanis with the term “тазики” (“taziks”), claiming they are nomads with no history compared to Armenians.
The user included hashtags in the post calling for an end to Azerbaijani aggression and vandalism. Another hashtag called for the recognition of Artsakh, the Armenian name for the Nagorno-Karabakh region, which is at the center of the conflict between Armenia and Azerbaijan. The post was published during the recent armed conflict between the two countries and received more than 45,000 views.
Key findings
Facebook removed the post for violating its Community Standard on Hate Speech, claiming the post used a slur to describe a group of people based on a protected characteristic (national origin).
The post used the term “тазики” (“taziks”) to describe Azerbaijanis. While this can be translated literally from Russian as “wash bowl,” it can also be understood as wordplay on the Russian word “азики” (“aziks”), a derogatory term for Azerbaijanis which features on Facebook’s internal list of slur terms. Independent linguistic analysis commissioned on behalf of the Board confirms Facebook’s understanding of “тазики” as a dehumanizing slur attacking national origin.
The context in which the term was used makes clear it was meant to dehumanize its target. As such, the Board believes that the post violated Facebook’s Community Standards.
The Board also found that Facebook’s decision to remove the content complied with the company’s values. While Facebook takes “Voice” as a paramount value, the company’s values also include “Safety” and “Dignity.”
From September to November 2020, fighting over the disputed territory of Nagorno-Karabakh resulted in the deaths of several thousand people; the content in question was posted shortly before a ceasefire.
In light of the dehumanizing nature of the slur and the danger that such slurs can escalate into physical violence, Facebook was permitted in this instance to prioritize people’s “Safety” and “Dignity” over the user’s “Voice”.
A majority of the Board found that the removal of this post was consistent with international human rights standards on limiting freedom of expression.
The Board believed it was apparent to users that the term “тазики”, used to describe Azerbaijanis, would be classed as a dehumanizing label for a group defined by its nationality, and that Facebook had a legitimate aim in removing the post.
The majority of the Board also viewed Facebook’s removal of the post as necessary and proportionate to protect the rights of others. Dehumanizing slurs can create an environment of discrimination and violence which can silence other users. During an armed conflict, the risks to people’s rights to equality, security of person and, potentially, life are especially pronounced.
While the majority of the Board found that these risks made Facebook’s response proportionate, a minority believed that Facebook’s action did not meet international standards and was not proportionate. A minority thought Facebook should have considered other enforcement measures besides removal.
The Oversight Board’s decision
The Board upholds Facebook’s decision to remove the content.
In a policy advisory statement, the Board recommends that Facebook:
- Ensure that users are always notified of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing. In this case, the user was informed that the post violated Facebook’s Community Standard on Hate Speech but was not told that this was because the post contained a slur attacking national origin. Facebook’s lack of transparency left its decision open to the mistaken belief that the company removed the content because the user expressed a view it disagreed with.
*Case summaries provide an overview of the case and do not have precedential value.
Full Case Decision
1. Decision Summary
The Oversight Board has upheld Facebook’s decision to remove a user’s post about the alleged destruction of churches in Azerbaijan for violating the Community Standard on Hate Speech.
Independent analysis commissioned by the Board confirms Facebook’s assessment that the post contained a slur demeaning Azerbaijani national origin, which violates the Community Standards. Although the post contains political speech, Facebook was permitted to protect the safety and dignity of users by removing the post, especially in the context of an ongoing armed conflict between Armenia and Azerbaijan.
Removing the post was also consistent with international human rights standards, which permit certain tailored restrictions on expression aimed at protecting the rights of others.
The Board also advises Facebook to offer more detail on why posts have been removed to provide greater clarity and notice to users.
2. Case Description
In November 2020, a user posted content which included historical photos described as showing churches in Baku, Azerbaijan. The accompanying text, in Russian, claimed that Armenians built Baku and that this heritage, including the churches, had been destroyed. The user described Azerbaijanis with the term “т.а.з.и.к.и” (“taziks”), claiming they are nomads with no history compared to Armenians. “Tazik,” which means “wash bowl” in Russian, appears to have been used in the post as a play on “azik,” a derogatory term for Azerbaijanis.
The user included hashtags in the post calling for an end to Azerbaijani aggression and vandalism. Another hashtag called for the recognition of Artsakh, the Armenian name for the Nagorno-Karabakh region, which is at the center of the conflict between Armenia and Azerbaijan. The post was published during the recent armed conflict between the two countries and received more than 45,000 views. Facebook removed the post for violating its Community Standard on Hate Speech. The user submitted a request for review to the Oversight Board.
3. Authority and Scope
The Board has the authority to review Facebook’s decision under Article 2 (Authority to Review) of the Board’s Charter and may uphold or reverse that decision under Article 3, Section 5 (Procedures for Review: Resolution) of the Charter. Facebook has not presented reasons for the content to be excluded in accordance with Article 2, Section 1.2.1 (Content Not Available for Board Review) of the Board’s Bylaws, nor has Facebook indicated that it considers the case to be ineligible under Article 2, Section 1.2.2 (Legal Obligations) of the Bylaws.
4. Relevant Standards
The Oversight Board considered the following standards in its decision:
I. Facebook’s Community Standards:
Facebook’s Community Standard on Hate Speech defines hate speech as “a direct attack on people based on what we call protected characteristics — race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability”. Prohibited content includes “content that describes or negatively targets people with slurs, where slurs are defined as words commonly used as insulting labels”.
Facebook’s policy rationale says that such speech is not allowed “because it creates an environment of intimidation and exclusion and in some cases may promote real-world violence”.
II. Facebook’s Values:
The Facebook values relevant to this case are outlined in the introduction to the Community Standards. The first is “Voice”, which is described as “paramount”:
The goal of our Community Standards has always been to create a place for expression and give people a voice. […] We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.
Facebook limits “Voice” in service of four other values. The Board considers that two of these values are relevant to this decision:
Safety: We are committed to making Facebook a safe place. Expression that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on Facebook.
Dignity: We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade others.
III. Relevant Human Rights Standards considered by the Board:
The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. The UN Working Group on Human Rights and Transnational Corporations, tasked with monitoring the implementation of the UNGPs, has addressed their applicability in conflict situations (A/75/212, 2020). Drawing upon the UNGPs, the following international human rights standards were considered in this case:
- The right to freedom of expression: International Covenant on Civil and Political Rights (ICCPR), Articles 19 and 20; as interpreted by General Comment No. 34, Human Rights Committee (2011) (General Comment 34); UN Special Rapporteur on freedom of opinion and expression, reports A/69/335 (2014), A/HRC/38/35 (2018), A/73/348 (2018), A/74/486 (2019) and A/HRC/44/49 (2020); and the Rabat Plan of Action, OHCHR (2012);
- The right to non-discrimination: ICCPR Articles 2 and 26; International Convention on the Elimination of All Forms of Racial Discrimination (ICERD), Articles 1, 4 and 5; as interpreted by the Committee on the Elimination of Racial Discrimination, General Recommendation No. 35 (2013) (GR35);
- The right to life: ICCPR Article 6; as interpreted by General Comment No. 36, Human Rights Committee (2018) (GC36);
- The right to security of person: ICCPR Article 9, para. 1; as interpreted by General Comment No. 35, para. 9, Human Rights Committee (2014).
5. User Statement
In their statement to the Board, the user claimed that their post was not hate speech but was intended to demonstrate the destruction of Baku’s cultural and religious heritage. They also claimed that the post was only removed because Azerbaijani users who have “hate towards Armenia and Armenians” are reporting content posted by Armenians.
6. Explanation of Facebook’s Decision
Facebook removed the post for violating its Community Standard on Hate Speech, claiming the post used a slur to describe a person or group of people on the basis of a protected characteristic (national origin). Facebook stated that it removed the content for using the term “тазики” (“taziks”) to describe Azerbaijanis. This word can be translated literally from Russian as “wash bowl,” but can also be understood as wordplay on the Russian word “азики” (“aziks”). Facebook explained to the Board that the latter word is on its internal list of slur terms, which it compiles after consultation with regional experts and civil society organizations. After assessing the whole post and the context in which it was made, Facebook determined that the user posted the slur to insult Azerbaijanis.
7. Third party submissions
The Oversight Board considered 35 public comments related to this case. Two of the comments submitted were from Central and South Asia, six from Europe, and 27 from the United States and Canada.
The submissions covered the following themes: the use of slurs and derogatory language which violate the Community Standards; the factual accuracy of the post’s claims; whether the post constitutes legitimate political or historical discussion; and the importance of assessing the background situation and context, including the conflict in Nagorno-Karabakh.
8. Oversight Board Analysis
8.1 Compliance with Community Standards
The user’s post violated Facebook’s Community Standard on Hate Speech. This Community Standard explicitly prohibits the use of slurs based on ethnicity or national origin. The Board commissioned independent linguistic analysis which supports Facebook’s understanding of this term as a slur. The linguistic report confirms that the post implies a connection between “тазики,” or “wash bowl,” and “азики,” a term often used to describe Azerbaijanis in a derogatory manner.
There may be instances in which words that are demeaning in one context might be more benign, or even empowering, in another. Facebook’s Community Standard on Hate Speech acknowledges that, in some cases, “words or terms that might otherwise violate [its] standards are used self-referentially or in an empowering way.” The context in which “тазики” was used in this post makes clear, however, that, in linking Azerbaijanis to wash bowls, it was meant to dehumanize its target.
8.2 Compliance with Facebook Values
The Board finds that the removal was consistent with Facebook’s values of “Safety” and “Dignity,” which in this case displaced the value of “Voice”.
Facebook’s values place a priority on “Voice” as users of the platform must be able to express themselves freely. Facebook’s values also, however, include “Safety” and “Dignity.” Speech that is otherwise protected may be restricted when leaving this content and other posts like it on the platform makes Facebook less safe, and, relatedly, undermines the dignity and equality of people. Facebook’s prohibition on the use of slurs targeting national origin is intended to prevent users from posting content meant to silence, exclude, harass, or degrade other users. Left up, an accumulation of such content may create an environment in which acts of discrimination and violence are more likely.
In this case, Facebook was permitted to treat the use of a slur as a serious interference with the values of “Safety” and “Dignity.” The conflict between Armenia and Azerbaijan, neighbors in the South Caucasus, is of long standing. Most recently, from September to November 2020, fighting over the disputed territory of Nagorno-Karabakh resulted in the deaths of several thousand people. The content in question was posted to Facebook shortly before a ceasefire went into effect. This context was especially relevant for the Board. While pointed language may be a part of human interactions, particularly in conflict situations, the danger of dehumanizing slurs proliferating in a way that escalates into acts of violence is one that Facebook should take seriously.
8.3 Compliance with Human Rights Standards
- Freedom of Expression (Article 19 ICCPR)
Facebook has recognized its responsibilities to respect human rights under the UN Guiding Principles on Business and Human Rights and indicated that it looks to authorities like the ICCPR and the Rabat Plan of Action when making content decisions, including in situations of armed conflict.
The Board agrees with the UN Special Rapporteur on freedom of expression that although “companies do not have the obligations of Governments, their impact is of a sort that requires them to assess the same kind of questions about protecting their users’ right to freedom of expression” (A/74/486, para. 41). Clarifying the nature of those questions and adjudicating whether Facebook’s answers fall within the zone of what the UN Guiding Principles require is the principal task facing this Board.
The Board’s starting point is that the scope of the right to freedom of expression is broad. Indeed, Article 19, para. 2, of the ICCPR gives heightened protection to expression on political issues and to discussion of historical claims, including as they relate to religious sites and peoples’ cultural heritage. That protection remains even where those claims may be inaccurate or contested and even when they may cause offense. Article 19, para. 3, of the ICCPR requires limits on freedom of expression to satisfy the three-part test of legality, legitimacy, and necessity and proportionality.
A majority of the Board found that Facebook’s removal of this post from the platform met that test.
a. Legality
To satisfy the requirement of “legality,” any rule setting out a restriction on expression must be clear and accessible. Individuals must have enough information to determine if and how their speech may be limited, so that they can adjust their behavior accordingly. This requirement guards against arbitrary censorship (General Comment No. 34, para. 25).
Facebook’s Community Standard on Hate Speech specifies that “slurs” are prohibited, and that these are defined as “words that are inherently offensive and used as insulting labels” in relation to a number of “protected characteristics,” including ethnicity and national origin.
In this case, the Board considered the legality requirement to be satisfied. There may be situations where a slur has multiple meanings or can be deployed in ways that would not be considered an “attack.” In more contested situations, concepts of “inherently offensive” and “insulting” may be considered too subjective and raise concerns for legality (A/74/486, para. 46). The application of the rule in this case does not present that concern. The user’s choice of words fell squarely within the prohibition on dehumanizing speech, which the Board views as clearly stated and easily available to users. The use of “т.а.з.и.к.и”, connecting a national identity to an inanimate unclean object, plainly qualifies as an “insulting label.”
While the user’s subjective understanding of the rules is not determinative of legality, the Board notes that the user attempted to conceal the slur from Facebook’s automated detection tools by placing punctuation between each letter. This tends to confirm that the user was aware that they were using language that Facebook prohibits.
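As a purely illustrative aside on the evasion technique described above: content moderation systems commonly neutralize letter-by-letter punctuation obfuscation by normalizing each token before matching it against a slur list. The sketch below is a hypothetical reconstruction of that general technique, not a description of Facebook’s actual detection tooling; the SLUR_TERMS list and the strip_punctuation and contains_listed_slur functions are invented for illustration.

```python
import unicodedata

# Hypothetical slur list, for illustration only; Facebook's real internal
# list is compiled with regional experts and is not public.
SLUR_TERMS = {"тазики", "азики"}

def strip_punctuation(token: str) -> str:
    """Remove punctuation characters so an obfuscation such as
    'т.а.з.и.к.и' reduces to the underlying term 'тазики'."""
    return "".join(ch for ch in token if unicodedata.category(ch)[0] != "P")

def contains_listed_slur(text: str) -> bool:
    """Lowercase and de-obfuscate each whitespace-delimited token,
    then check it against the list."""
    return any(strip_punctuation(tok.lower()) in SLUR_TERMS for tok in text.split())

# The obfuscated form used in this case would still be detected:
assert contains_listed_slur("т.а.з.и.к.и")
```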
b. Legitimacy
Any restriction on freedom of expression should also pursue a “legitimate aim.” These aims are listed in the ICCPR, and include the aim of protecting “the rights of others” (General Comment No. 34, para. 28). Facebook’s prohibition on slurs seeks to protect people’s rights to equality and non-discrimination (Article 2, para. 1, ICCPR), to exercise their freedom of expression on the platform without being harassed or threatened (Article 19 ICCPR), to protect the right to security of person from foreseeable and intentional injury (Article 9, ICCPR, General Comment No. 35, para. 9), and even the right to life (Article 6 ICCPR).
c. Necessity and Proportionality
Necessity and proportionality require Facebook to show that its restriction on freedom of expression was necessary to address the threat, in this case the threat to the rights of others, and that it was not overly broad (General Comment No. 34, para. 34). The Board notes that international human rights law allows prohibitions on “insults, ridicule or slander of persons or groups or justification of hatred, contempt or discrimination” if such expression “clearly amounts to incitement to hatred or discrimination” on the grounds of race, colour, descent or national or ethnic origin (A/74/486, para. 17; GR35, para. 13).
Facebook’s Hate Speech Community Standard prohibits some discriminatory expression, including slurs, absent any requirement that the expression incite violent or discriminatory acts. While such prohibitions would raise concerns if imposed by a Government at a broader level (A/74/486, para. 48), particularly if enforced through criminal or civil sanctions, the Special Rapporteur indicates that entities engaged in content moderation like Facebook can regulate such speech:
The scale and complexity of addressing hateful expression presents long-term challenges and may lead companies to restrict such expression even if it is not clearly linked to adverse outcomes (as hateful advocacy is connected to incitement in Article 20(2) of the ICCPR). Companies should articulate the bases for such restrictions, however, and demonstrate the necessity and proportionality of any content actions. (A/HRC/38/35, para. 28)
A majority of the Board found the slur used in this case hateful and dehumanizing. While it did not constitute incitement, the potential for adverse outcomes was nevertheless present. Context is key. The Board welcomes Facebook’s explanation that its designation of this term as a slur followed consultations with local experts and civil society organizations aware of its contextual usage. The majority noted that the post, when read as a whole, made clear the user’s choice of slur was not incidental but central to the user’s argument that the target group was inferior. Moreover, the post in question was widely disseminated at the height of an armed conflict between the user’s State and the State whose nationals the post attacked. The use of dehumanizing language in this context may have online effects, including creating a discriminatory environment that undermines the freedom of others to express themselves. In situations of armed conflict in particular, the risk of hateful, dehumanizing expressions accumulating and spreading on a platform, leading to offline action impacting the right to security of person and potentially life, is especially pronounced. In this particular case, for a majority of the Board, the presence of these risks and Facebook’s human rights responsibility to avoid contributing to them meant it was permitted to remove the slur.
Furthermore, the Board found the removal proportionate. Less severe interventions, such as labels, warning screens, or other measures to reduce dissemination, would not have provided the same protection. Notably, Facebook did not take more severe measures also available to it, such as suspending the user’s account, despite the user seemingly re-posting offending content several times. This illustrates that, notwithstanding the removal of this specific piece of content, the user remained free to engage in discussions on the same issues within the boundaries of the Community Standards.
A minority found Facebook’s deletion of the post was not proportionate, on the basis that the risks cited by the majority were too remote and not foreseeable. Alternative, less intrusive enforcement options should therefore have been considered; examples include affixing a warning or sensitivity screen to the content, reducing its virality, or promoting counter-messaging. On this minority view, removing the whole post because it used the slur removed speech on a matter of public concern, and the necessity and proportionality of that restriction were not made out.
Another minority view was that the reference to an inanimate object was offensive but not dehumanizing. This view considered that the slur would not contribute to military or other violent action.
9. Oversight Board Decision
9.1 Content Decision
The Board upholds Facebook’s decision to remove the user’s post.
9.2 Policy Advisory Statement
The Board recommends that Facebook:
- Ensure that users are always notified of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing. Doing so would enable Facebook to encourage expression that complies with its Community Standards rather than adopting an adversarial posture toward users. In this case, the user was informed that the post violated the Community Standard on Hate Speech but was not told that the post violated the standard because it included a slur targeting national origin. Facebook satisfied the principle of legality in this instance, but Facebook’s lack of transparency left its decision susceptible to the mistaken belief that it had removed the post because the user was addressing a controversial subject or expressing a viewpoint Facebook disagreed with.
*Procedural note:
The Oversight Board’s decisions are prepared by panels of five Members and must be agreed by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.
For this case decision, independent research was commissioned on behalf of the Board. An independent research institute headquartered at the University of Gothenburg and drawing on a team of over 50 social scientists on six continents, as well as more than 3,200 country experts from around the world, provided expertise on socio-political and cultural context. The company Lionbridge Technologies LLC, whose specialists are fluent in more than 350 languages and work from 5,000 cities across the world, provided linguistic expertise.