
Oversight Board publishes transparency report for third quarter of 2022


December 2022

As part of the Oversight Board's commitment to transparency, each quarter we share details of the cases we receive from users, the decisions we have taken, and our recommendations to Meta.

Highlights from our Q3 2022 transparency report

Today's report shows that in Q3 2022 (1 July to 30 September 2022), the Board received more than a quarter of a million (270,843) appeals from Facebook and Instagram users around the world. In this quarter, the Board issued two case decisions, both of which overturned Meta's decision on the content in question. Between them, these case decisions made 10 recommendations to Meta. The Board also selected six new cases in this quarter, and accepted a request from Meta for a policy advisory opinion on the removal of COVID-19 misinformation.

By selecting cases which raise issues faced by many other Facebook and Instagram users, and by making recommendations to address them, the Board has an impact which goes beyond individual pieces of content. The Board’s “Colombian police cartoon” decision, for example, was chosen as an opportunity to examine, and make recommendations on, Meta’s media matching banks, which can automatically remove images that violate Meta’s rules. After the Board selected this case, Meta also restored many other pieces of content featuring the same cartoon as the post under review.

1. Meta showed progress in sharing information with the Board

Meta is increasingly open in the information it provides to the Board, helping the Board to increase its impact.

In this quarter, we have seen a welcome, continued increase in the level of technical detail in Meta's submissions and responses to the Board. This includes useful explanations of the company's internal systems, such as "strikes" and media matching banks, which help the Board to understand Meta's approach to content moderation at scale. This enables the Board to make recommendations that affect more users, and to increase transparency through its decisions.

2. Meta is implementing more of the Board’s recommendations

Between Q2 and Q3 2022, the proportion of recommendations the Board deemed to have been either fully or partially implemented increased. Meta has now either fully or partially implemented more than one quarter (27%) of our recommendations (compared to 21% in Q2). For another third (34%), Meta has reported progress towards implementation. In total, 35 of the Board’s recommendations have now been fully or partially implemented, compared to 25 in Q2 2022.

In addition, for the second quarter in a row, 100% of the recommendations published in Q3 received “comprehensive” or “somewhat comprehensive” initial responses from Meta.

3. The Board’s recommendations continue to reshape Meta’s approach to content moderation

Our Q3 2022 transparency report also shows how the Board’s recommendations, many of which build on work that researchers, civil society groups and others have been doing for years, are leading to systemic changes in the company's approach.

  • In response to our repeated calls for Meta to tell people the specific rule that they had broken when removing their content, the company launched new notifications globally detailing specific policy violations for the Hate Speech, Dangerous Individuals and Organizations, and Bullying and Harassment policies. It is working to expand the messaging to all Community Standards and to multiple languages by the end of this year.
  • In response to several recommendations on the Dangerous Individuals and Organizations policy, Meta initiated an in-depth review of this policy area, focused on taking a risk-based approach to prioritizing designations of organizations and individuals. This means that the individuals or organizations assessed as posing the highest risk would be prioritized for enforcement. Meta is also engaging in a policy development process on how it assesses whether content amounts to praise, substantive support, or representation of a designated individual or organization. Previously, the Board found that Meta's definition of "praise" within the policy was too restrictive of user expression. The policy development process is focused on allowing for more user voice in discussions of organizations and individuals covered by the policy, without sacrificing safety, in a way that can be equitably operationalized at scale.
  • In its decision about a post reporting sexual violence against minors, the Board recommended that Meta look at whether to amend the Community Standards to prohibit the functional identification of child victims of sexual violence. In response, Meta has initiated an in-depth policy review of its approach to preventing the identification of victims of sexual violence in its Child Sexual Exploitation, Abuse and Nudity and Adult Sexual Exploitation policies.

What's next

As part of our commitment to transparency, we will continue to publish transparency reports on a quarterly basis. We will also publish detailed information on our work in our annual report, which assesses Meta's performance in implementing our decisions and recommendations.

Attachments

Q3 2022 Quarterly Transparency Report