OVERTURNED
2020-002-FB-UA

Myanmar post about Muslims

The Oversight Board has overturned Facebook's decision to remove a post under its hate speech Community Standard.
Policies and topics
Politics, Religion, Violence
Hate speech
Region and countries
Europe
Myanmar, France, China
Platform
Facebook

To read this decision in Burmese, click here.


Case Summary

The Oversight Board has overturned Facebook’s decision to remove a post under its Hate Speech Community Standard. The Board found that, while the post might be considered offensive, it did not reach the level of hate speech.

About the case

On October 29, 2020, a user in Myanmar posted in a Facebook group in Burmese. The post included two widely shared photographs of a Syrian toddler of Kurdish ethnicity who drowned attempting to reach Europe in September 2015.

The accompanying text stated that there is something wrong with Muslims (or Muslim men) psychologically or with their mindset. It questioned the lack of response by Muslims generally to the treatment of Uyghur Muslims in China, compared to killings in response to cartoon depictions of the Prophet Muhammad in France. The post concludes that recent events in France reduce the user’s sympathies for the depicted child, and seems to imply the child may have grown up to be an extremist.

Facebook removed this content under its Hate Speech Community Standard.

Key findings

Facebook removed this content because it contained the phrase “[there is] something wrong with Muslims psychologically,” which the company found to violate its Hate Speech Community Standard’s prohibition on generalized statements of inferiority about the mental deficiencies of a group on the basis of their religion.

The Board considered that while the first part of the post, taken on its own, might appear to make an insulting generalization about Muslims (or Muslim men), the post should be read as a whole, considering context.

While Facebook translated the text as: “[i]t’s indeed something’s wrong with Muslims psychologically,” the Board’s translators suggested: “[t]hose male Muslims have something wrong in their mindset.” They also suggested that the terms used were not derogatory or violent.

The Board’s context experts noted that, while hate speech against Muslim minority groups is common and sometimes severe in Myanmar, statements referring to Muslims as mentally unwell or psychologically unstable are not a strong part of this rhetoric.

Taken in context, the Board believes that the text is better understood as a commentary on the apparent inconsistency between Muslims’ reactions to events in France and in China. That expression of opinion is protected under Facebook’s Community Standards and does not reach the level of hate speech.

Considering international human rights standards on limiting freedom of expression, the Board found that, while the post might be considered pejorative or offensive towards Muslims, it did not advocate hatred or intentionally incite any form of imminent harm. As such, the Board does not consider its removal to be necessary to protect the rights of others.

The Board also stressed that Facebook’s sensitivity to anti-Muslim hate speech was understandable, particularly given the history of violence and discrimination against Muslims in Myanmar and the increased risk ahead of the country’s general election in November 2020. However, for this specific post, the Board concludes that Facebook was incorrect to remove the content.

The Oversight Board’s decision

The Oversight Board overturns Facebook’s decision to remove the content and requires that the post be restored.

*Case summaries provide an overview of the case and do not have precedential value.

Full Case Decision

1. Decision Summary

The Oversight Board has overturned Facebook’s decision to remove content it considered hate speech. The Board concludes that Facebook categorized a post as hate speech when it did not rise to that level.

2. Case Description

On October 29, 2020, a Facebook user in Myanmar posted in Burmese to a group which describes itself as a forum for intellectual discussion. The post includes two widely-shared photographs of a Syrian toddler of Kurdish ethnicity who drowned in the Mediterranean Sea in September 2015. The accompanying text begins by stating that there is something wrong with Muslims (or Muslim men) psychologically or with their mindset. It questioned the lack of response by Muslims generally to the treatment of Uyghur Muslims in China, compared to killings in response to cartoon depictions of the Prophet Muhammad in France. The post concludes that recent events in France reduce the user’s sympathies for the depicted child, and seems to imply the child may have grown up to be an extremist.

Facebook found that the statement “[there is] something wrong with Muslims psychologically” constituted ‘Tier 2’ hate speech under its Community Standards. As this prohibits generalized statements of inferiority about the mental deficiencies of a person or group of people on the basis of their religion, Facebook removed the content.

Prior to removal, on November 3, 2020, the two photographs included in the post had warning screens placed on them under the Violent and Graphic Content Community Standard. According to Facebook, nothing else in the post violated its policies. The user appealed to the Oversight Board, arguing they had not used hate speech.

3. Authority and Scope

The Oversight Board has the authority to review Facebook’s decision under the Board’s Charter Article 2.1 and may uphold or overturn that decision under Article 3.5. This post is within the Oversight Board’s scope of review: it does not fit within any excluded category of content set forth in Article 2, Section 1.2.1 of the Board’s Bylaws and it does not conflict with Facebook’s legal obligations under Article 2, Section 1.2.2 of the Bylaws.

4. Relevant Standards

The Oversight Board considered the following standards in its decision:

I. Facebook’s Community Standards

The Community Standard on Hate Speech states that Facebook does “not allow hate speech on Facebook because it creates an environment of intimidation and exclusion and in some cases may promote real-world violence.” Facebook defines hate speech as an attack based on protected characteristics. Attacks may be “violent or dehumanizing speech, harmful stereotypes, statements of inferiority, or calls for exclusion or segregation” and are separated into three tiers of prohibited content. Under Tier 2, prohibited content includes:

generalizations that state inferiority (in written or visual form) in the following ways [...] mental deficiencies are defined as those about: intellectual capacity [...] education [...] mental health.

Protected characteristics are “race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability” with some protection for age and immigration status.

II. Facebook’s Values

The introduction to the Community Standards notes that “Voice” is Facebook’s paramount value, but the platform may limit “Voice” in service of several other values including “Safety.” Facebook’s definition of “Safety” states: “We are committed to making Facebook a safe place. Expression that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on Facebook.”

III. Relevant Human Rights Standards Considered by the Board

The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. Drawing upon the UNGPs, the following international human rights standards were considered in this case:

  • The right to freedom of expression: International Covenant on Civil and Political Rights (ICCPR), Article 19; General Comment No. 34, Human Rights Committee (2011) (General Comment 34); Rabat Plan of Action; UN Special Rapporteur on Freedom of Expression Report on Online Hate Speech (2019) (A/74/486).
  • The right to non-discrimination: ICCPR Articles 2 and 26.
  • The right to life and security: ICCPR Articles 6 and 9.

5. User Statement

The user submitted their appeal against Facebook’s decision to remove the content in November 2020. The user stated that their post did not violate Facebook’s Community Standards and that they did not use hate speech. The user explained that their post was sarcastic and meant to compare extremist religious responses in different countries. The user also stated that Facebook is not able to distinguish between sarcasm and serious discussion in the Burmese language and context.

6. Explanation of Facebook’s Decision

Facebook removed this content based on its Hate Speech Community Standard. Facebook considered this post a Tier 2 attack under that standard, as a generalization of mental deficiency regarding Muslims. According to information provided by Facebook, which is not currently in the public domain, generalizations “are unqualified negative statements, with no room for reason, factual accuracy, or argument and they infringe on the rights and reputations of others.” Facebook stated that the only component of the post that violated Community Standards was the statement that something is wrong with Muslims psychologically.

Facebook also argued that its Hate Speech Community Standard aligns with international human rights standards. According to Facebook, although prohibited speech under this standard may not rise to “advocacy or incitement to violence,” such expression can be restricted as it has the “capacity to trigger acts of discrimination, violence, or hatred, particularly if distributed widely, virally, or in contexts with severe human rights risks.” As the context for this case, Facebook cited the recent attack in Nice, France which left three people dead, the ongoing detention of Uyghur Muslims in China, the Syrian refugee crisis, and anti-Muslim violence in general.

7. Third Party Submissions

The Board received 11 public comments related to this case. One comment contained no content; the remaining 10 provided substantive submissions. The regional breakdown of the comments was: one from Asia Pacific and Oceania, four from Europe, and five from the United States and Canada. The submissions covered various themes, including: whether the provocative and objectionable post constituted hate speech with sufficient clarity; whether the content was an attack against Muslims; whether the user’s intent was to shed light on the treatment of Uyghur Muslims in China and the Syrian refugee crisis; whether the user’s intent was to condemn rather than promote the death of individuals; whether the reference to retaliation in the post could imply a direct call for physical violence against Chinese nationals; and feedback for improving the Board’s public comment process.

8. Oversight Board Analysis

8.1 Compliance with Community Standards

The post does not constitute Hate Speech within the meaning of the relevant Community Standard.

In this case, Facebook indicated that the speech in question was a Tier 2 attack under the Hate Speech Community Standard. The protected characteristic was religious affiliation, as the content described Muslims or Muslim men. According to Facebook, the attack was a “generalization that state[s] inferiority” about “mental deficiencies.” This section prohibits attacks about “[m]ental health, including but not limited to: mentally ill, retarded, crazy, insane” and “[i]ntellectual capacity, including, but not limited to: dumb, stupid, idiots.”

Although the first sentence of the post, taken on its own, might appear to be making an offensive and insulting generalization about Muslims (or Muslim men), the post should be read as a whole, considering context.

Human rights organizations and other experts have indicated that hate speech against Muslim minority groups in Myanmar is common and sometimes severe, in particular around the general election on November 8, 2020 (FORUM-ASIA briefing paper on pervasive hate speech and the role of Facebook in Myanmar, pages 5-8; Report of the UN independent international fact-finding mission on Myanmar, A/HRC/42/50, paras 1303, 1312, 1315 and 1317). However, there was no indication that statements referring to Muslims as mentally unwell or psychologically unstable are a significant part of anti-Muslim rhetoric in Myanmar. Further, while Facebook translated the sentence as “[i]t’s indeed something’s wrong with Muslims psychologically,” the Board’s translators found it stated “[t]hose male Muslims have something wrong in their mindset.” The translators also suggested that while the terms used could show intolerance, they were not derogatory or violent.

The post is thus better read, in light of context, as a commentary pointing to the apparent inconsistency between Muslims’ reactions to events in France and in China. That expression of opinion is protected under the Community Standards, and does not reach the level of hate speech that would justify removal.

8.2 Compliance with Facebook Values

Facebook’s decision to remove the content does not comply with the company’s values. Although Facebook’s value of “Safety” is important, particularly in Myanmar given the context of discrimination and violence against Muslims, this content did not pose a risk to “Safety” that would justify displacing “Voice.”

8.3 Compliance with Human Rights Standards

Restoring the post is consistent with international human rights standards.

According to Article 19 of the ICCPR, individuals have the right to seek and receive information, including controversial and deeply offensive information (General Comment No. 34). Some Board Members noted that the UN Special Rapporteur on Freedom of Expression’s 2019 report on online hate speech affirms that international human rights law “protects the rights to offend and mock” (para. 17). Some Board Members expressed concerns that commentary on the situation of Uyghur Muslims may be suppressed or under-reported in countries with close ties to China.

At the same time, the Board recognizes that the right to freedom of expression is not absolute and can be subject to limitations under international human rights law.

First, the Board assessed whether the content was subject to a mandatory restriction under international human rights law. The Board found that the content was not advocacy of religious hatred constituting incitement to discrimination, hostility or violence, which states are required to prohibit under ICCPR Article 20, para. 2. The Board considered the factors cited in the UN Rabat Plan of Action, including the context, the content of the post, and the likelihood of harm. While the post had a pejorative tone, the Board did not consider that it advocated hatred, and did not consider that it intentionally incited any form of imminent harm.

The Board also discussed whether this content could be restricted under ICCPR Article 19, para. 3. This provision of international human rights law requires restrictions on expression to be defined and easily understood (legality requirement), to have the purpose of advancing one of several listed objectives (legitimate aim requirement), and to be necessary and narrowly tailored to the specific objective (necessity and proportionality requirement).

The Board recognizes that Facebook was pursuing a legitimate aim through the restriction: to protect the rights of others to life, to security of person, to protection from physical or mental injury, and to protection from discrimination. The Board acknowledges that online hate speech in Myanmar has been linked to serious offline harm, including accusations of potential crimes against humanity and genocide. As such, the Board recognized the importance of protecting the rights of those who may be subject to discrimination and violence, and who may even be at risk of atrocities.

Nonetheless, the Board concludes that, while some may consider the post offensive and insulting towards Muslims, its removal was not necessary to protect the rights of others.

The Board recognizes that online hate speech is a complex issue to moderate, and that linguistic and cultural features such as sarcasm make moderation more difficult. In this case, there were no indications that the post contained threats against identifiable individuals.

The Board acknowledges it is difficult for Facebook to evaluate the intent behind individual posts when moderating content at scale and in real time. While not decisive, the Board considered the user’s claim in their appeal that they are opposed to all forms of religious extremism. The fact that the post was within a group that claimed to be for intellectual and philosophical discussion, and also drew attention to discrimination against Uyghur Muslims in China, lends support to the user’s claim. At the same time, some Board Members found the user’s references to the refugee child who had died to be insensitive.

The Board emphasizes that restoring any particular post does not imply any agreement with its content. Even in circumstances where discussion of religion or identity is sensitive and may cause offense, open discussion remains important. Removing this content is unlikely to reduce tensions or protect persons from discrimination. There are more effective ways to encourage understanding between different groups.

The Board also emphasizes that Facebook’s sensitivity to the possibility of anti-Muslim hate speech in Myanmar is understandable, given the history of violence and discrimination against Muslims in that country, the context of increased risk around the elections, and the limited information available at the time. In these circumstances, Facebook’s caution demonstrated a general recognition of the company’s human rights responsibilities. Nonetheless, for this specific piece of content, the Board concludes that Facebook was incorrect to remove it.

9. Oversight Board Decision

9.1 Content Decision

The Oversight Board overturns Facebook’s decision to take down the content, requiring the post to be restored.

The Board understands the photos will again have a warning screen under the Violent and Graphic Content Community Standard.

*Procedural note:

The Oversight Board’s decisions are prepared by panels of five Members and must be agreed by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.

For this case decision, independent research was commissioned on behalf of the Board. An independent research institute headquartered at the University of Gothenburg, drawing on a team of over 50 social scientists on six continents and more than 3,200 country experts from around the world, provided expertise on socio-political and cultural context. Lionbridge Technologies, LLC, whose specialists are fluent in more than 350 languages and work from 5,000 cities across the world, provided linguistic expertise.
