Case Decisions and Policy Advisory Opinions
The Oversight Board reviews Meta’s content decisions to determine whether the company acted in line with its own policies, values and human rights commitments. The Board can overturn or uphold Meta’s decision.
Case Decisions
The Board issues three types of case decisions: standard, summary and expedited. All three result in binding decisions that Meta must implement.
Standard
In-depth review of Meta’s decision to remove or allow a post, including recommendations to the company.
Summary
Analysis of Meta’s original decision on a post when the company reverses that decision after the Board selects the case for review.
Expedited
Rapid review of Meta’s decision on a post in exceptional situations with urgent real-world consequences.
Standard
Multiple Case Decision
2024-038-FB-UA, 2024-039-FB-UA, 2024-040-FB-UA
Footage of Moscow Terrorist Attack
The Board has overturned Meta’s decisions to remove three Facebook posts showing footage of the March 2024 terrorist attack in Moscow, requiring the content to be restored with “Mark as Disturbing” warning screens.
Summary
Overturned
2024-057-FB-UA
Derogatory Image of Candidates for U.S. Elections
A user appealed Meta’s decision to remove content containing an altered and derogatory depiction of U.S. presidential candidate Kamala Harris and her running mate Tim Walz.
Standard
Overturned
2024-041-FB-UA
Homophobic Violence in West Africa
The Oversight Board is seriously concerned about Meta’s failure to take down a video showing two men who appear to have been beaten for allegedly being gay. In overturning the company’s original decision, the Board notes that by leaving the video on Facebook for five months, Meta created a risk of immediate harm by exposing the men’s identities, given the hostile environment for LGBTQIA+ people in Nigeria.
Policy Advisory Opinions
Meta can also ask the Board for guidance on specific policy issues by requesting a policy advisory opinion. The Board’s recommendations in these opinions feed into the company’s policy development process.
Published
PAO-2023-01
Referring to Designated Dangerous Individuals as “Shaheed”
This policy advisory opinion analyzes Meta’s approach to moderating the word “shaheed,” raising important questions about the impact of the Dangerous Organizations and Individuals policy on freedom of expression.
Published
PAO-2022-01
Removal of COVID-19 Misinformation
This policy advisory opinion examines whether Meta should continue to remove certain categories of COVID-19 misinformation, or whether a less restrictive approach would better align with its values and human rights responsibilities.
Published
PAO-2021-02
Meta’s Cross-Check Program
This policy advisory opinion analyzes Meta’s cross-check program, raising important questions around how Meta treats its most powerful users.