2022 Annual Report: Oversight Board reviews Meta’s changes to bring fairness and transparency to its platforms


June 2023

Today, we are publishing our 2022 Annual Report. This gives a comprehensive account of the Board’s work in 2022, including the publication of our first policy advisory opinions and the growing impact of our recommendations on how people experience Facebook and Instagram.

Read our Annual Report in English here.

Read our Annual Report in Spanish here.

Note: we will also be publishing this report in Arabic, Chinese, French, Hindi, and Russian in the coming weeks.

Turning commitments into impact

Since we published our first decisions in January 2021, Meta has implemented many of our recommendations. These changes have made the company more transparent and fairer in its treatment of users.

From January 2021 through early April 2023, when this Annual Report was finalized, the Board made a total of 191 recommendations to Meta. For around two-thirds of these, Meta has either fully or partially implemented the recommendation, or reported progress towards its implementation.

In 2022, it was encouraging to see that, for the first time, Meta made systemic changes to its rules and how they are enforced, including on user notifications and its rules on dangerous organizations. While we know there is still more to do, today’s Annual Report highlights several examples of the Board’s impact so far.

Tell people why their content was removed

As a Board, the recommendation we have made most often is for Meta to tell people why their content was removed.

In the past, we have seen users left guessing about why Meta removed their content. In response to our recommendations, Meta has introduced new messaging globally that tells people the specific policy they violated under its Hate Speech, Dangerous Individuals and Organizations, and Bullying and Harassment policies. In response to a further recommendation, Meta also completed a global rollout of messaging telling people whether human or automated review led to their content being removed. We believe that giving people more information on how and why their content was removed will build trust and improve fairness in how Meta applies its rules.

Make your rules more consistent and transparent

Today’s Annual Report also shows how our recommendations are making Meta’s rules clearer and more consistent.

In response to our proposals, Meta introduced a Crisis Policy Protocol to make its responses to crisis situations more consistent, and recently announced plans to create a new crisis coordination team to provide dedicated 24/7 operations oversight for imminent and emerging crises.

Meta also launched a review of its Dangerous Individuals and Organizations policy and created a new Community Standard on misinformation, consolidating and clarifying its rules in one place. Finally, following our concerns around Meta’s opaque penalty system and user concerns about being placed in “Facebook jail,” the company changed its “strikes” system to make it fairer and more transparent.

Protect important expression from journalists, protesters, and breast cancer awareness campaigners

Our Annual Report also shows how our recommendations are helping to protect important expression from journalists, protesters, and breast cancer awareness campaigners on Facebook and Instagram.

In response to our policy advisory opinion on Meta’s cross-check program, for example, the company committed to extending greater protections to those at particular risk of having their content wrongly removed, including journalists and human rights defenders.

We also protected the voice of users during political and social transformations. As part of our “Iran protest slogan” decision, we urged Meta to better protect political speech in Iran, where historic protests have been violently suppressed. In response, Meta allowed the term “Marg bar Khamenei” (which literally translates as “Death to [Iran’s supreme leader] Khamenei”) to be shared in the context of ongoing protests in Iran and reversed previous enforcement actions against this kind of content.

Finally, in our “breast cancer symptoms and nudity” decision, we urged Meta to improve how it detects images with text-overlay so that posts raising awareness of breast cancer symptoms are not wrongly flagged for review. In response, Meta enhanced its techniques for identifying breast cancer context in content on Instagram.

These enhancements have been in place since July 2021 and, between February 26 and March 27, 2023, contributed to an additional 2,500 pieces of content being sent for human review that would have previously been removed. While it is difficult to contextualize this without a denominator, this first impact metric shows how our recommendation helped Meta mitigate a systemic risk resulting from the way the company was handling this kind of content.

A new, transparent dialogue

By publicly making these recommendations, and publicly monitoring Meta’s responses and implementation, we have opened a space for transparent dialogue with the company that did not previously exist. We hope that our recommendations will trigger public discourse about how platforms can approach some of the most complex challenges in content moderation.

In many cases, our proposals echo, or build upon, calls that civil society groups and other stakeholders have been making for many years — forcing Meta to consider and respond publicly to longstanding calls for action. As a Board, we would like to reiterate our gratitude to these organizations for sharing their ideas and expertise.

The Board’s work in 2022

In addition to highlighting the Board’s growing impact on Meta, our Annual Report also provides an overview of cases submitted to the Board.

In 2022, we received nearly 1.3 million appeals from users to restore or remove content on Facebook or Instagram, an increase of around a quarter compared to 2021. More than two-thirds of appeals to restore content in 2022 concerned posts removed under Meta’s rules on violence and incitement or hate speech.

After the Board’s Case Selection Committee shortlists cases for Board review, Meta sometimes determines that its original decision on a piece of content was incorrect and reverses it. In 2022, Meta identified that its original decision was incorrect in 32 of the 50 cases shortlisted by the Board. While this is only a small sample, and the Board intentionally seeks out the most challenging cases, it is concerning that in nearly two-thirds of shortlisted cases, Meta found its original decision to have been incorrect.

Our Annual Report also outlines how, in 2022, we:

  • Published our first policy advisory opinions on sharing private residential information and Meta’s cross-check program.
  • Overturned Meta’s original decision on a piece of content in three-quarters of our case decisions.
  • Received more than 600 public comments from individuals and organizations around the world.
  • Expanded our scope to include the ability to add warning screens to eligible content.
  • Announced seven strategic priorities: elections and civic space, crisis and conflict situations, gender, hate speech against marginalized groups, government use of Meta’s platforms, treating users fairly, and automated enforcement of policies and curation of content.

Evolving our work with Meta

In February 2023, we announced plans to review more cases, faster. Increasing the number of decisions we produce, and the speed at which we do so, will let us tackle more of the big challenges of content moderation. Since February, we have announced 10 new cases for consideration, as well as a policy advisory opinion on the Arabic term “Shaheed” (which often translates as “martyr”). We will publish these in the coming weeks and months.

Our Annual Report sets out six goals to help us increase our impact in 2023. These include publishing our first summary decisions, which examine cases where Meta reversed its original decision on a piece of content after we brought the case to the company’s attention. We also plan to issue our first expedited decisions in 2023, which review Meta’s decision on a post within days, and we look forward to receiving our first expedited case from the company. Finally, we aim to complete the initial composition of the Board, deepen engagement around our seven strategic priorities, pursue long-term plans for scope expansion, and push Meta to provide evidence of implementation and impact.

Sharing the benefits of independent oversight

From the outset, the Board was designed to test an independent approach to content moderation, which, if successful, could also be applied to other companies. In the last three years, we have acquired a wealth of experience in independent oversight that can help companies make more robust decisions based on respect for freedom of expression and other human rights.

Our Annual Report shares thoughts on how a content governance oversight board should function, ranging from the importance of diversity to the value of international human rights standards in decision-making. As new regulation brings new requirements, there are also specific areas, such as transparency and user notifications, where we believe we can provide part of the solution.

What’s next

As part of our commitment to transparency, we also publish quarterly transparency reports throughout the year. In the coming days, we will be publishing our Q1 2023 quarterly transparency report. This will highlight further examples of the Board’s impact on Meta and will include a review of Meta’s January 2023 announcement on former President Trump.

Attachments

  • 2022 Annual Report in English
  • 2022 Annual Report in Spanish
  • Implementation Annex
  • 2022 Annual Report in Arabic
  • 2022 Annual Report in Chinese
  • 2022 Annual Report in French
  • 2022 Annual Report in Hindi
  • 2022 Annual Report in Russian