A graphic depiction of a line-drawn circle encompassing circles, rectangles and a triangle.

Oversight Board overturns Facebook decision: Case 2021-007-FB-UA


August 2021

The Oversight Board has overturned Facebook’s decision to remove a post in Burmese under its Hate Speech Community Standard. The Board found that the post did not target Chinese people, but the Chinese state. Specifically, it used profanity to reference Chinese governmental policy in Hong Kong as part of a political discussion on the Chinese government’s role in Myanmar.

About the case

In April 2021, a Facebook user who appeared to be in Myanmar posted in Burmese on their timeline. The post discussed ways to limit financing to the Myanmar military following the coup in Myanmar on February 1, 2021. It proposed that tax revenue be given to the Committee Representing Pyidaungsu Hluttaw (CRPH), a group of legislators opposed to the coup. The post received about half a million views and no Facebook users reported it.

Facebook translated the supposedly violating part of the user’s post as “Hong Kong people, because the fucking Chinese tortured them, changed their banking to UK, and now (the Chinese) they cannot touch them.” Facebook removed the post under its Hate Speech Community Standard. This prohibits content targeting a person or group of people based on their race, ethnicity or national origin with “profane terms or phrases with the intent to insult.”

The four content reviewers who examined the post all agreed that it violated Facebook’s rules. In their appeal to the Board, the user stated that they posted the content to “stop the brutal military regime.”

Key findings

This case highlights the importance of considering context when enforcing hate speech policies, as well as the importance of protecting political speech. This is particularly relevant in Myanmar given the February 2021 coup and Facebook’s key role as a communications medium in the country.

The post used the Burmese phrase “စောက်တရုတ်” (“sout ta-yote”), which Facebook translated as “fucking Chinese.” According to Facebook, the word “ta-yote” “is perceived culturally and linguistically as an overlap of identities/meanings between China the country and the Chinese people.” Facebook stated that given the nature of this word and the fact that the user did not “clearly indicate that the term refers to the country/government of China,” it determined that “the user is, at a minimum, referring to Chinese people.” As such, Facebook removed the post under its Hate Speech Community Standard.

As the same word is used in Burmese to refer to a state and people from that state, context is key to understanding the intended meaning. A number of factors convinced the Board that the user was not targeting Chinese people, but the Chinese state.

The part of the post which supposedly violated Facebook’s rules refers to China’s financial policies in Hong Kong as “torture” or “persecution,” not to the actions of individuals or of Chinese people in Myanmar. Both of the Board’s translators indicated that, in this case, the word “ta-yote” referred to a state. When asked whether there could be any ambiguity in this reference, neither translator indicated doubt. The Board’s translators also stated that the post contains terms commonly used by Myanmar’s government and the Chinese embassy to address each other. In addition, while half a million people viewed the post and over 6,000 people shared it, no users reported it. Public comments also described the overall tone of the post as a political discussion.

Given that the post did not target people based on race, ethnicity, or national origin, but was aimed at a state, the Board found it did not violate Facebook’s Hate Speech Community Standard.

The Oversight Board’s decision

The Oversight Board overturns Facebook’s decision to remove the content, requiring the post to be restored.

In a policy advisory statement, the Board recommends that Facebook:

  • Ensure that its Internal Implementation Standards are available in the language in which content moderators review content. If necessary to prioritize, Facebook should focus first on contexts where the risks to human rights are more severe.

For further information:

To read the full case decision, click here.

To read a synopsis of public comments for this case, click here.

Attachments

2021-007-FB-UA Public Comments