A person scrutinizing a sphere she’s holding in her hand, while shapes and clouds float around her.

Announcing the Board’s next cases and changes to our Bylaws


March 2021

Today, the Board is announcing new cases as well as changes to the Bylaws which govern our work.

Case selection

Since we started accepting cases in October 2020, more than 220,000 cases have been appealed to the Board. As we cannot hear every appeal, we are prioritizing cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies.

The cases we are announcing today are:

2021-004-FB-UA

Case referred by user

Submit public comment here.

In January 2021, User A left a comment summarizing their first-hand experience during the recent protests in support of the Russian opposition leader Alexei Navalny. In the comment, User A called User B, who had criticized the protesters, a “common cowardly bot” [банальный трусливый бот].

On January 24, User B commented on a post consisting of several pictures, a video, and text of the protests in support of Alexei Navalny held in Saint Petersburg and across Russia on January 23. User B claimed that they did not know what happened in Saint Petersburg, but that protesters in Moscow were all school children, mentally “slow”, and were “used.” They added that the protesters were not the voice of the people but a “theatre show.” Other users began challenging this critical account of the protesters in the comments thread.

User A challenged User B’s assertion. User A identified as elderly and claimed to have participated in the Saint Petersburg protests with their colleagues and adult children. User A stated that there were so many protesters that they lost their companions in the crowd. They claimed to have witnessed elderly and disabled protesters. They expressed pride in the youth that participated in the protests and dismissed the claim that anyone manipulated them. In their final words, User A called User B a “common cowardly bot” [“банальный трусливый бот”].

User B was the only person who reported the comment. Facebook took it down under its Bullying and Harassment Community Standard, which provides for the removal of content that is meant to degrade or shame private individuals. In certain instances, Facebook requires self-reporting by the person who has been targeted by the bullying or harassment.

As part of their appeal, User A explained to the Board that they had shared first-hand experience from the protests and that they were responding to blatantly false information. User A added that they believed that User B was “a bot that works on order without being an eyewitness and participant in the events.” User A further indicated that their opinion about this user had not changed, that the term “bot” was not a “dirty word”, and that they did not endorse terrorism or any illegal action.

The Board would appreciate public comments that address:

  • Whether Facebook’s decision complied with its Community Standard on Bullying and Harassment.
  • Whether Facebook’s decision and Community Standard on Bullying and Harassment complied with its values and its human rights responsibilities, considering both the importance of free expression and the need to ensure scalable responses to harms caused by bullying and harassment.
  • Contextual information regarding the common use or acceptance of terms equivalent to “банальный трусливый бот” (“common cowardly bot”) when discussing emotive topics on social media, specifically in the Russian language.
  • Any impacts of Facebook’s enforcement of its Community Standards on dissenting or minority political viewpoints in contexts where governments routinely restrict critical expression.
  • Research on dis- and misinformation campaigns against participants in the January 2021 protests in Russia, how such campaigns are disseminated on social media, and evidence of any coordinated inauthentic behavior and the actors involved.

2021-005-FB-UA

Case referred by user

Submit public comment here.

In December 2020, a Facebook user in the United States posted a comment containing an adaptation of the “two buttons” meme. This meme featured the same split-screen cartoon from the original meme, but with the cartoon character’s face replaced by a Turkish flag. The cartoon character has their right hand on their head and appears to be sweating. Above the cartoon character, in the other half of the split-screen, there are two red buttons with corresponding labels, in English: “The Armenian Genocide is a lie” and “The Armenians were terrorists who deserved it.” The meme was preceded and followed by “thinking face” emoji.

The user’s comment was in response to a post containing an image of a person wearing a niqab with overlay text in English saying: “Not all prisoners are behind bars.” At this point, the Board does not have access to all the intervening comments, and the meme may have been a response to one of those intervening comments.

Facebook removed the post under its Cruel and Insensitive Community Standard after one report from another Facebook user. Under this standard, Facebook removes content that “targets victims of serious physical or emotional harm,” including “explicit attempts to mock victims and mark as cruel implicit attempts, many of which take the form of memes and GIFs.” Subsequently, Facebook reclassified its removal to fall under its Hate Speech Community Standard.

In their appeal to Facebook, the user states that “[h]istorical events should not be censored” and that their comment was not meant to offend but to point out “the irony of a particular historical event.” The user speculates that Facebook misinterpreted their comment as an attack, and states that even if the content invokes “religion and war” it is not a “hot button issue.” The user also finds Facebook and its policies overly restrictive, arguing that “[h]umor like many things is subjective and something offensive to one person may be funny to another.”

The Board would appreciate public comments that address:

  • Whether Facebook’s decision to remove the post was consistent with Facebook’s Cruel and Insensitive Community Standard, specifically the rule against explicit and implicit attempts to mock victims.
  • Additionally or alternatively, whether Facebook’s decision to remove the post was consistent with Facebook’s Hate Speech Community Standard, for example its rule on mocking victims of a hate crime.
  • Whether Facebook’s decision to remove the post is consistent with the company’s stated values and human rights responsibilities.
  • Any specific insight from commenters with knowledge of the social, political and cultural context in Armenia, Turkey and diaspora communities regarding the likely intent and impact of the post.
  • How Facebook can and should take humor and/or satire into account in enforcing its policies.
  • Research on present-day discourse about the events referred to in the meme, including effects of suppressing this kind of speech, either at the initiative of Facebook or as a consequence of governmental action.

Public comments

If you or your organization feel that you can contribute valuable perspectives that can help with reaching a decision on these cases, you can submit contributions using the links above.

With the completion of the Board’s first set of cases, we are iterating and improving the public comments process based on the feedback shared by participants. As part of this, we are extending the window for public comment to 14 days for all cases. We hope this will give as many people and organizations as possible the chance to engage with the Board’s work.

In light of these changes, the public comment window for cases announced today will close at 15:00 UTC on Tuesday, March 16, 2021.

Updating our Bylaws

We are also working through a number of changes to our Bylaws to make us more effective as an organization. As such, today we are announcing changes to the timelines for how cases are decided and implemented.

Previously, our Bylaws set a 90-day timeframe for cases to be decided by the Board and implemented by Facebook, starting from Facebook’s last decision on a case. Under the revised Bylaws, this 90-day period starts when the Board assigns a case to a panel. This update will help ensure that all cases have the same amount of time for deliberation, no matter when the case was referred to the Board by Facebook or a user.

Further changes will help the Board move quickly in deciding cases referred by Facebook under expedited review, which must be completed within 30 days. For example, Co-Chairs, in consultation with the Board’s Director, can now ensure expedited cases are assigned to panels that are able to deliberate them within the 30-day timeframe. Panels will also be able to shorten the time a user has to submit their statement.

You can read our updated Bylaws in full here.

What’s next

In the coming weeks, Board Members will be deliberating the cases announced today. Once they have reached their final decisions, we will post them on the Oversight Board website.
