Oversight Board Publishes Transparency Report for Second Quarter of 2022 and Gains Ability to Apply Warning Screens
October 20, 2022
Today we are issuing a transparency report for the second quarter of 2022, which you can read here.
The Board is also gaining the ability to make binding decisions to apply a warning screen when leaving up or restoring qualifying content.
Highlights From Our Q2 2022 Transparency Report
As part of the Oversight Board’s commitment to transparency, each quarter we share details of the cases we are receiving from users, the decisions we have taken, and our recommendations to Meta.
Today’s report shows that in Q2 2022 (April 1 to June 30, 2022) the Board received around 347,000 appeals from Facebook and Instagram users around the world. In this quarter, the Board published three case decisions. Two of these overturned Meta’s decision on the content in question, while one upheld it. Together, these decisions made 10 recommendations to Meta.
1. The Board has received nearly two million appeals since October 2020
From April to June 2022, we estimate that users submitted around 347,000 appeals to the Board. This means that, since we started accepting appeals two years ago, we have received nearly two million appeals from users around the world. This demonstrates the ongoing demand from users to appeal Meta’s content moderation decisions to an independent body.
2. Meta showed progress in how it responded to our questions and recommendations
In the last two years, the Board has opened a transparent space for dialogue with Meta that did not previously exist. Our Q2 2022 transparency report shows notable improvements in this regard.
In this quarter, Meta continued to answer the vast majority of the questions we asked as part of our case decisions. Of the 59 questions asked, Meta answered 51, partially answered seven, and did not answer one. Where Meta partially answered or did not answer our questions, the company gave constructive responses, explaining why certain data-related questions would take a long time to answer and showing openness to providing data insights in the future. In addition, Meta has already wholly or partially implemented four out of the 10 recommendations we made in Q2 2022, demonstrating a marked increase in responsiveness from the company.
3. The Board's recommendations are continuing to reshape Meta's approach to content moderation
Our Q2 2022 transparency report also shows how Meta’s responses to our past recommendations are leading to systemic changes in the company’s approach.
- In April 2021, our “Punjabi concern over the RSS in India” decision urged Meta to translate its Community Standards into all languages widely spoken by its users. In response, Meta has translated its rules into 15 Asian and African languages, including Farsi, Hausa, and Punjabi. As a result of this recommendation, around 800 million people in Global Majority countries can now read Meta’s rules in their native language.
- In response to our recommendation that Meta tell people the specific rule they had broken when removing their content, Meta is now systematically measuring the level of detail of its user communications for content removals across all Community Standards. In response to the Board’s focus on users more generally, Meta also announced it would be building a customer services division to help users whose posts or accounts had been removed.
- In response to a Board recommendation in the “shared Al Jazeera post” decision, Meta released the findings of an independent due diligence report on the impact of the company’s policies in Israel and Palestine during the May 2021 conflict. According to the report, Meta’s actions in May 2021 appear to have had an adverse human rights impact on the rights of Palestinian users to freedom of expression and their ability to share information and insights about their experiences as they occurred.
New Powers to Apply Warning Screens
Expanding the Board’s scope is also central to increasing our impact. Having previously gained the ability to review user appeals to remove content, starting this month the Board will be able to make binding decisions to apply a warning screen when leaving up or restoring qualifying photo and video content. Warning screens include marking content as ‘disturbing’ or ‘sensitive.’ We have published an amended version of our Bylaws to reflect this. We have also made changes to our Rulebook to give the Board more flexibility in preparing policy advisory opinions.
What’s Next
As part of our commitment to transparency, we will continue to publish transparency reports on a quarterly basis. We will also publish detailed information on our work in our annual report, which assesses Meta’s performance in implementing our decisions and recommendations.