
Oversight Board demands more transparency from Facebook


October 2021

Over the last several weeks, media reporting has drawn renewed attention to the seemingly inconsistent way in which Facebook makes decisions, and why greater transparency and independent oversight of Facebook matter so much for users.

Since publishing its first decisions in January, the Oversight Board has pushed Facebook to reveal more information about how it works and to treat its users fairly. So far, we have taken on 20 important cases and issued 17 decisions covering topics from hate speech to COVID-19 misinformation. We’ve received around 10,000 public comments from people across the world and made more than 75 recommendations to Facebook.

As part of our commitment to transparency, today we are publishing our first quarterly transparency reports.

These cover the fourth quarter of 2020, as well as the first and second quarters of 2021. They provide new details on the cases we are receiving from users, the decisions we have taken and our recommendations to Facebook. Moving forward, we will publish a transparency report as soon as possible after each quarter ends. We will also be publishing annual reports, which will provide a more detailed qualitative assessment of how Facebook is implementing the Board’s decisions and recommendations.

Today’s reports conclude that Facebook has not been fully forthcoming with the Board on its ‘cross-check’ system, which the company uses to review content decisions relating to high-profile users. The Board has also announced that it has accepted a request from Facebook, in the form of a policy advisory opinion, to review its cross-check system and make recommendations on how it can be changed. As part of this review, Facebook has agreed to share with the Board documents concerning the cross-check system as reported in the Wall Street Journal.

Six highlights from the Board’s quarterly reports

1. Over half a million user appeals submitted to the Board

Between October 2020 and the end of June 2021, Facebook and Instagram users submitted around 524,000 cases to the Board. User appeals increased in each quarter, with around 114,000 cases in the fourth quarter of 2020, 203,000 cases in the first quarter of 2021, and around 207,000 cases in the second quarter of 2021.

2. Two-thirds of appeals where users wanted their content restored related to hate speech or bullying

Up to the end of June 2021, we estimate that more than a third of cases submitted to the Board (36%) related to content concerning Facebook’s rules on Hate Speech. We estimate that Bullying and Harassment made up another third (31%) of cases submitted, with Violence and Incitement (13%), Adult Nudity and Sexual Activity (9%) and Dangerous Individuals and Organizations (6%) making up most of the remaining cases. These figures do not include user appeals to remove content from Facebook, which were introduced in mid-April.

3. Nearly half of user appeals came from the United States and Canada

Up to the end of June, we estimate that nearly half of cases submitted (46%) came from the US and Canada, while 22% of cases came from Europe and 16% from Latin America and the Caribbean. We estimate that 8% of cases came from the Asia Pacific and Oceania region, 4% came from the Middle East and North Africa, 2% came from Central and South Asia and 2% came from Sub-Saharan Africa.

We do not believe this represents the actual distribution of Facebook content issues around the globe. If anything, we have reason to believe that users in Asia, Sub-Saharan Africa, and the Middle East experience more, not fewer, problems with Facebook than parts of the world with more appeals.

We are expanding our outreach in these areas to ensure that Board oversight extends to users everywhere, and we ask that users and civil society organizations in Asia, Sub-Saharan Africa and the Middle East take notice of our concern and bring appeals when they suffer the effects of poor content moderation by Facebook in their areas.

4. The Board’s wider processes prompted Facebook to restore more than 30 pieces of content covering significant cases

As part of the Board’s process for shortlisting cases, we ask Facebook to confirm that cases are eligible for review under the Bylaws. As a result of this process, by the end of June Facebook identified 38 shortlisted cases where its original decision on the piece of content was incorrect.

In 35 of these cases, Facebook then took action on the content, while in three cases it could not do so as the content had been deleted by the user.

Nearly half of the cases where Facebook identified its original decision as incorrect related to the Hate Speech Community Standard, while nearly a third related to Dangerous Individuals and Organizations.

5. Facebook is answering most of the Board’s questions, but not all of them

To assist with making our decisions and to push Facebook to be as transparent as possible, we send questions to Facebook about specific cases. Of the 156 questions sent to Facebook about decisions we published through the end of June, Facebook answered 130, partially answered 12 and declined to answer 14.

By asking specific questions and including the details in our final decision, we hope to provide users and researchers with as much information as possible about how the company works.

In our reports, we give examples of the types of questions which Facebook declined to answer, and the company’s reasons for doing so. For example, in several instances, Facebook declined to answer questions about the user’s previous behavior on Facebook, which the company claimed was irrelevant to the Board’s determination about the case in hand.

6. Facebook was not fully forthcoming with the Board on cross-check

Following recent disclosures in the Wall Street Journal, the Board committed to look at whether Facebook had been forthcoming in its responses on its cross-check system, which the company uses to review content decisions relating to high-profile users.

In the Board’s view, the team within Facebook tasked with providing information has not been fully forthcoming on cross-check. On some occasions, Facebook failed to provide relevant information to the Board, while in other instances, the information it did provide was incomplete.

When Facebook referred the case related to former US President Trump to the Board, it did not mention the cross-check system. Given that the referral included a specific policy question about account-level enforcement for political leaders, many of whom the Board believes were covered by cross-check, this omission is not acceptable. Facebook only mentioned cross-check to the Board when we asked whether Mr. Trump’s page or account had been subject to ordinary content moderation processes.

In its subsequent briefing to the Board, Facebook admitted it should not have said that cross-check only applied to a “small number of decisions.” Facebook noted that for teams operating at the scale of millions of content decisions a day, the numbers involved with cross-check seem relatively small, but recognized its phrasing could come across as misleading.

We also noted that Facebook’s response to our recommendation to “clearly explain the rationale, standards and processes of [cross-check] review, including the criteria to determine which pages and accounts are selected for inclusion” provided no meaningful transparency on the criteria for accounts or pages being selected for inclusion in cross-check.

The credibility of the Oversight Board, our working relationship with Facebook, and our ability to render sound judgments on cases all depend on being able to trust that the information Facebook provides us is accurate and comprehensive, and that it paints a full picture of the topic at hand. We will continue to track and report on the information Facebook provides to ensure it is as complete as possible.

Today, the Board has also announced it has accepted a request from Facebook, in the form of a policy advisory opinion, to review the company’s cross-check system and make recommendations on how it can be changed.

Specifically, Facebook requested guidance on, among other things: how to ensure fairness and objectivity in cross-check reviews, taking into account context; how to govern cross-check and promote transparency; and the criteria it uses to determine who is included in cross-check and how to ensure this is equitable.

Now that we have accepted Facebook’s request, the Board will engage with civil society globally, including academics and researchers, as we scrutinize this critical issue. This will include a call for public comments which we will launch in the coming days. The Board continues to reach out to a broad range of voices to inform its work, including former Facebook employees who have come forward in recent months.

Facebook has now agreed to share with the Board documents concerning cross-check as reported on in the Wall Street Journal. The Board will review these as we produce our policy advisory opinion.

Facebook has also committed, from now on, to provide information about the wider context that may be relevant to the Board’s case decisions. This should give a fuller understanding of the work Facebook has already done on a given topic. We will assess whether Facebook is fulfilling this commitment in our future transparency reporting.

Once the Board has deliberated on this policy advisory opinion, and voted to approve it, we will issue our recommendations to Facebook. Facebook must then respond within 30 days.

Users left guessing on Facebook’s rules

In many of the decisions covered in today’s reports, and in those we’ve published since, a clear theme has emerged: Facebook isn’t being clear with the people who use its platforms. We’ve consistently seen users left guessing about why Facebook removed their content.

In one of our first cases, Facebook removed a user’s content but didn’t tell them which Community Standard they had broken. In another, Facebook didn’t review a user’s appeal before they came to the Board. In a third case, Facebook said a user had broken its rules on dangerous individuals, but then found that, for three years, it had lost track of a policy exception that clearly allowed the user’s post.

Having received over half a million appeals by the end of June, we know these cases are just the tip of the iceberg. Right now, it’s clear that by not being transparent with users, Facebook is not treating them fairly.

Many can relate to the experience of having their content removed with little explanation of what they did wrong. The Board is deeply concerned with the impact on users and the implications for freedom of expression around the world.

Steering Facebook towards greater transparency

Transparency is clearly an area where Facebook must urgently improve, and we want to be part of the solution. Our recommendations have repeatedly urged Facebook to follow some central tenets of transparency: make your rules easily accessible in your users’ languages; tell people as clearly as possible how you make and enforce your decisions; and, where people break your rules, tell them exactly what they’ve done wrong.

Since Facebook published its first responses to our recommendations earlier this year, we’ve already seen some early commitments from the company with the potential to increase transparency for users.

In May, Facebook committed to translate its Community Standards into Punjabi, a language spoken by 130 million people, by the end of this year. In June, Facebook pledged to include its satire exception, which was not previously communicated to users, in its Community Standards by the end of this year.

In August, Facebook agreed to provide information in its Transparency Center on content removed for violating its Community Standards following a formal report by a government, including the number of requests it receives. And this month, Facebook said it would “implement fully” our recommendation for an independent entity, not associated with either side of the Israel-Palestinian conflict, to conduct a thorough examination into whether its content moderation in Hebrew and Arabic – including its use of automation – has been applied without bias.

Of course, this is just a start. For these recommendations, and others, we will be monitoring whether and how Facebook lives up to the promises it has made. Over time, we believe that the combined impact of our recommendations will push Facebook to be more transparent and benefit users.

What’s next

Pushing Facebook to be more transparent, to treat users fairly, and to honor its human rights commitments is a long-term effort that requires engagement by policymakers and regulators, civil society, researchers, and the media. The Oversight Board is part of this effort.

In areas where we feel that Facebook is falling short, such as transparency, we will keep challenging the company to do better. We will do this through our decisions, recommendations, and regular transparency reporting, including our annual report which we will publish next year.

Catalina Botero-Marino, Jamal Greene, Michael McConnell, Helle Thorning-Schmidt

Co-Chairs of the Oversight Board

Attachments

Quarterly transparency reports