
Oversight Board overturns Facebook decision: Case 2020-002-FB-UA


January 2021

The Oversight Board has overturned Facebook’s decision to remove a post under its Hate Speech Community Standard. The Board found that, while the post might be considered offensive, it did not reach the level of hate speech.

About the case

On October 29, 2020, a user in Myanmar posted in a Facebook group in Burmese. The post included two widely shared photographs of a Syrian toddler of Kurdish ethnicity who drowned attempting to reach Europe in September 2015.

The accompanying text stated that there is something wrong with Muslims (or Muslim men) psychologically or with their mindset. It questioned the lack of response by Muslims generally to the treatment of Uyghur Muslims in China, compared to killings in response to cartoon depictions of the Prophet Muhammad in France. The post concluded that recent events in France reduced the user's sympathies for the depicted child, and seemed to imply the child may have grown up to be an extremist.

Facebook removed this content under its Hate Speech Community Standard.

Key findings

Facebook removed this content because it contained the phrase “[there is] something wrong with Muslims psychologically,” which the company considered to violate its Hate Speech Community Standard prohibiting generalized statements of inferiority about the mental deficiencies of a group on the basis of their religion.

The Board considered that while the first part of the post, taken on its own, might appear to make an insulting generalization about Muslims (or Muslim men), the post should be read as a whole, considering context.

While Facebook translated the text as: “[i]t’s indeed something’s wrong with Muslims psychologically,” the Board’s translators suggested: “[t]hose male Muslims have something wrong in their mindset.” They also suggested that the terms used were not derogatory or violent.

The Board’s context experts noted that, while hate speech against Muslim minority groups is common and sometimes severe in Myanmar, statements referring to Muslims as mentally unwell or psychologically unstable are not a strong part of this rhetoric.

Taken in context, the Board believes that the text is better understood as a commentary on the apparent inconsistency between Muslims’ reactions to events in France and in China. That expression of opinion is protected under Facebook’s Community Standards and does not reach the level of hate speech.

Considering international human rights standards on limiting freedom of expression, the Board found that, while the post might be considered pejorative or offensive towards Muslims, it did not advocate hatred or intentionally incite any form of imminent harm. As such, the Board does not consider its removal to be necessary to protect the rights of others.

The Board also stressed that Facebook’s sensitivity to anti-Muslim hate speech was understandable, particularly given the history of violence and discrimination against Muslims in Myanmar and the increased risk ahead of the country’s general election in November 2020. However, for this specific post, the Board concluded that Facebook was incorrect to remove the content.

The Oversight Board’s decision

The Oversight Board overturns Facebook’s decision to remove the content and requires that the post be restored.

For further information:

To read the full case decision, click here.

To read a synopsis of public comments for this case, click here.

Update 17:00 GMT, 28.01.21: Following publication of the Myanmar decision, two paragraphs in Section 8 were updated by the Board in relation to physical injury and mental integrity to more precisely capture the Board's conclusions.
