A graphic depiction of a line-drawn circle encompassing circles, rectangles and a triangle.

Oversight Board overturns Facebook decision: case 2021-004-FB-UA


May 2021

The Oversight Board has overturned Facebook’s decision to remove a comment in which a supporter of imprisoned Russian opposition leader Alexei Navalny called another user a “cowardly bot.” Facebook removed the comment for using the word “cowardly,” which it construed as a negative character claim.

The Board found that while the removal was in line with the Bullying and Harassment Community Standard, the current Standard was an unnecessary and disproportionate restriction on free expression under international human rights standards. It was also not in line with Facebook’s values.

About the case

On January 24, a user in Russia made a post consisting of several pictures, a video, and text (root post) about the protests in support of opposition leader Alexei Navalny held in Saint Petersburg and across Russia on January 23. Another user (the Protest Critic) responded to the root post and wrote that while they did not know what happened in Saint Petersburg, the protesters in Moscow were all school children, mentally “slow,” and were “shamelessly used.”

Other users then challenged the Protest Critic in subsequent comments to the root post. A user who was at the protest (the Protester) appeared to be the last to respond to the Protest Critic. They claimed to be elderly and to have participated in the protest in Saint Petersburg. The Protester ended the comment by calling the Protest Critic a “cowardly bot.”

The Protest Critic then reported the Protester’s comment to Facebook for bullying and harassment. Facebook determined that the term “cowardly” was a negative character claim against a “private adult” and, since the “target” of the attack reported the content, Facebook removed it. The Protester appealed against this decision to Facebook. Facebook determined that the comment violated the Bullying and Harassment policy, under which a private individual can get Facebook to take down posts containing a negative comment on their character.

Key findings

This case highlights the tension between policies protecting people against bullying and harassment and the need to protect freedom of expression. This is especially relevant in the context of political protest in a country where there are credible complaints about the absence of effective mechanisms to protect human rights.

The Board found that, while Facebook’s removal of the content may have been consistent with a strict application of the Community Standards, the Community Standards failed to consider the wider context and disproportionately restricted freedom of expression.

The Community Standard on Bullying and Harassment states that Facebook removes negative character claims about a private individual when the target reports the content. The Board does not challenge Facebook’s conclusion that the Protest Critic is a private individual and that the term “cowardly” was a negative character claim.

However, the Community Standard did not require Facebook to consider the political context, the public character, or the heated tone of the conversation. Accordingly, Facebook did not consider the Protester’s intent to refute false claims about the protests or attempt to balance that concern against the reported negative character claim.

The decision to remove this content failed to balance Facebook’s values of “Dignity” and “Safety” against “Voice.” Political speech is central to the value of “Voice” and should only be limited where there are clear “Safety” or “Dignity” concerns.

“Voice” is also particularly important in countries where freedom of expression is routinely suppressed, as in Russia. In this case, the Board found that Facebook was aware of the wider context of pro-Navalny protests in Russia, and heightened caution should have led to a more careful assessment of the content.

The Board found that Facebook’s Community Standard on Bullying and Harassment has a legitimate aim in protecting the rights of others. However, in this case, combining the distinct concepts of bullying and harassment into a single set of rules, which were not clearly defined, led to the unnecessary removal of legitimate speech.

The Oversight Board’s decision

The Oversight Board overturns Facebook’s decision to remove the content, requiring the post to be restored.

In a policy advisory statement, the Board recommends that, to comply with international human rights standards, Facebook should amend and redraft its Bullying and Harassment Community Standard to:

  • Explain the relationship between its Bullying and Harassment policy rationale and the “Do nots” as well as the other rules restricting content that follow it.
  • Differentiate between bullying and harassment and provide definitions that distinguish the two acts. The Community Standard should also clearly explain to users how bullying and harassment differ from speech that only causes offense and may be protected under international human rights law.
  • Clearly define its approach to different target user categories and provide illustrative examples of each target category (i.e. who qualifies as a public figure). Organize the Community Standard on Bullying and Harassment according to the user categories currently listed in the policy.
  • Include illustrative examples of violating and non-violating content in the Bullying and Harassment Community Standard to clarify the policy lines drawn and how these distinctions can rest on the identity status of the target.
  • When assessing content including a “negative character claim” against a private adult, Facebook should amend the Community Standard to require an assessment of the social and political context of the content. Facebook should reconsider the enforcement of this rule in political or public debates where the removal of the content would stifle debate.
  • Whenever Facebook removes content because of a negative character claim that is only a single word or phrase in a larger post, it should promptly notify the user of that fact, so that the user can repost the material without the negative character claim.

For further information:

To read the full case decision, click here.

To read a synopsis of public comments for this case, click here.
