
Announcing the Oversight Board’s first case decisions


January 2021

Today, the Oversight Board is announcing its first decisions.

In the five case decisions published today, the Board overturned four of Facebook’s decisions, upheld one and issued nine policy recommendations to the company. The cases covered four continents: Asia, Europe, North America and South America.

None of these cases had easy answers, and deliberations revealed the enormous complexity of the issues involved.

In one case, Board Members looked at whether, in the context of an armed conflict, Facebook was right to remove an otherwise-permissible post because it contained a hateful slur. In another, they examined whether a post accused of spreading COVID-19 misinformation contributed to imminent harm. In several cases, Members questioned whether Facebook’s rules were clear enough for users to understand.

These decisions followed a process which was thorough, principled and globally relevant, as outlined in the Board’s Rulebook. The Board’s decisions are binding on Facebook and provide a critical independent check on how the company moderates content.

The outcome of the Board’s decisions

After careful deliberation, the Board has:

Overturned Facebook’s decision on case 2020-002-FB-UA to remove a post under its Community Standard on Hate Speech. The post commented on the supposed lack of reaction to the treatment of Uyghur Muslims in China, compared to the violent reaction to cartoons in France.

Upheld Facebook’s decision on case 2020-003-FB-UA to remove a post under its Community Standard on Hate Speech. The post used the Russian word “тазики” (“taziks”) to describe Azerbaijanis, who, the user claimed, have no history compared to Armenians.

Overturned Facebook’s original decision on case 2020-004-IG-UA to remove a post under its Community Standard on Adult Nudity and Sexual Activity. The post included photos of breast cancer symptoms which, in some cases, showed uncovered female nipples.

Overturned Facebook’s decision on case 2020-005-FB-UA to remove a post under its Community Standard on Dangerous Individuals and Organizations. The post included an alleged quote from Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany.

Overturned Facebook’s decision on case 2020-006-FB-FBR to remove a post under its Community Standard on Violence and Incitement. The post criticized the lack of a health strategy in France and included claims that a cure for COVID-19 exists.

We have published the full decisions on the Board’s website.

How the Board made these decisions

Since we started accepting cases in October 2020, more than 150,000 cases have been appealed to the Board. As we cannot hear every appeal, we are prioritizing cases that have the potential to affect many users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies. Each case is assigned to a five-Member panel, which includes at least one Member from the region implicated in the content and reflects a mix of genders.

For each case, Members decided whether the content violated Facebook’s Community Standards and values. They also considered whether Facebook’s removal of the content respected international human rights standards, including freedom of expression and other human rights. Members considered factors ranging from the nuances of language to the user’s intent and the context in which the content was posted.

Members from different countries brought a range of experiences to deliberations. Expertise in areas such as journalism, technology, and human rights deepened their analysis, while differing views ensured a robust debate considering many angles.

After each panel reached a decision, its findings were reviewed and approved by a majority of the Board. This is required for a decision to be issued.

Holding Facebook to account

Today’s decisions are binding on Facebook and we will hold the company accountable for implementing them.

Facebook now has seven days to restore content in line with the Board’s decisions. The company will also examine whether identical content with parallel context associated with the Board’s decisions should remain on its platform. In addition, Facebook must publicly respond to any policy recommendations the Board has made in its decisions within 30 days.

Public comments

Today, alongside our first decisions, we are also publishing nearly 80 public comments.

These provided valuable insights in areas such as local context and Facebook’s Community Standards, as well as giving feedback on the public comments process itself. They shaped the Board’s thinking and, in one case, a policy recommendation drew upon public comments.

As with all the Board’s early work, we will iterate and improve our public comments process as we receive feedback. In particular, we are considering how to provide more context in our case descriptions and give third parties more time to comment. We encourage people and organizations to register for updates on new cases and contribute to future calls for public comment.

What’s next

The Board’s first case decisions are another step towards building a strong institution capable of holding Facebook to account over the long term.

In the coming days we expect to publish a decision on case 2020-007-FB-FBR. This case relates to India and a post removed under Facebook’s Violence and Incitement Community Standard. We will also shortly be announcing the Board’s next set of cases, and opening public comment on the case accepted by the Board last week relating to former US President Trump’s indefinite suspension from Facebook and Instagram.

Recent events in the United States and around the world have highlighted the enormous impact that content decisions taken by internet services have on human rights and free expression. The challenges and limitations of the existing approaches to moderating content draw attention to the value of independent oversight of the most consequential decisions by companies such as Facebook.

We believe the first case decisions by the Oversight Board demonstrate our commitment to holding Facebook to account, by standing up for the interests of users and communities around the world, and by beginning to reshape Facebook’s approach to content moderation. This is the start of a process that will take time, and we look forward to sharing our progress through the Board’s many subsequent case decisions.
