How we do our work
When people have exhausted Meta’s appeals process on Facebook, Instagram or Threads, they can challenge the company’s decision on a piece of content by appealing to the Oversight Board. Meta can also refer cases to us.
Once we have selected a case, our Board Members examine whether Meta’s decision to remove or leave up content was in line with its policies, values and human rights commitments. While we refer to Meta’s content policies and values as we consider cases, we sometimes question those policies when we believe they do not comply with Meta’s own commitment to protecting freedom of expression and other human rights.
When we come to a decision on a case, it is binding on Meta, unless implementing it could violate the law. As part of our decisions, we also make recommendations on the rules that apply to billions of Facebook, Instagram and Threads users, and how those rules are enforced.
Recommendations also form part of our policy advisory opinions, which we started publishing in 2022 to provide guidance to Meta on specific issues.
Increasing Transparency around Content Moderation
To date, each decision and policy advisory opinion has brought transparency to content moderation processes that were often unknown or unclear. Policy recommendations have created public debate about how digital platforms can approach some of the most complex challenges in content moderation.
Given the uncharted path we are walking, the Board will continue to adapt, finding new ways to fulfill our mission. At its best, social media can bring about global connection and conversation. Keeping these benefits while limiting potential harms is a daunting task. We appreciate the scale of the challenge ahead but, together, we can overcome the pitfalls and help people connect with confidence.
Key Milestones
The timeline below captures some of the Oversight Board’s key milestones since we selected our first cases.
December 2020
Announce the selection of our first cases. Drawn from four continents, they cover issues including COVID-19 misinformation, dangerous individuals and hate speech.
January 2021
Publish our first five decisions, overturning Meta’s original decision in four of the cases, while noting the enormous complexity of the issues involved.
April 2021
Start accepting user appeals to remove content that is live on Facebook and Instagram.
May 2021
In a decision related to former President Trump, we uphold his suspension from Facebook and Instagram, but criticize the “indefinite” penalty imposed.
June 2021
For the first time, we accept a request from Meta for a policy advisory opinion.
October 2021
Issue our first quarterly transparency report.
February 2022
As part of our first policy advisory opinion, we urge Meta to impose tighter restrictions on the sharing of private residential information.
July 2022
Meta makes an additional $150 million financial contribution to the Oversight Board Trust.
October 2022
Announce seven strategic priorities focusing on areas in which we can make the greatest impact. Additionally, we gain the ability to apply warning screens to eligible content.
November 2022
In our UK Drill Music decision, for the first time we examine a post removed after a request from national law enforcement.
December 2022
Our policy advisory opinion on Meta’s cross-check program, which includes 32 proposals, highlights the program’s flaws in key areas.
February 2023
Changes to our Charter and Bylaws allow us to take on expedited and summary decisions going forward. Expedited decisions see an urgent case examined in a matter of days.
June 2023
Publish our first summary decisions, which examine cases in which Meta reversed its original decision after we brought the content to the company’s attention.
August 2023
Three new cases announced in a single month demonstrate the breadth of content issues considered by the Board, from violence in the community to Holocaust denial and harms to mental health.
December 2023
The Board issues two expedited decisions on the Israel-Hamas conflict.
February 2024
The Board expands its scope to include Threads.