Oversight Board publishes transparency report for first quarter of 2022
August 4, 2022
Today we are issuing a transparency report for the first quarter of 2022, which you can read here. As part of the Oversight Board’s commitment to transparency, each quarter we share details of the cases we receive from users, the decisions we have taken, and our recommendations to Meta.
Today’s report shows that in Q1 2022 (January 1 to March 31, 2022) the number of appeals sent to the Board by users increased by two-thirds compared to the previous quarter, with nearly 480,000 cases submitted. In this quarter, the Board published two case decisions, both of which overturned Meta’s decision on the content in question, and issued one policy advisory opinion on the sharing of private residential information. We received 69 public comments on these cases, and, between them, the decisions and the policy advisory opinion made 22 recommendations to Meta.
Highlights from our Q1 2022 transparency report
1. The number of cases submitted to the Board increased by two-thirds
From January to March 2022, we estimate that users submitted nearly 480,000 cases to the Board. This represents an increase of two-thirds on the 288,440 cases submitted in the fourth quarter of 2021. In total, users submitted more than 1.6 million cases to the Board between October 2020, when we started accepting appeals, and March 2022.
The fact that so many Facebook and Instagram users are sending cases to the Board shows the enormous demand to appeal Meta’s content moderation decisions to an organization independent of the company.
2. Appeals to restore content removed under Meta’s violence and incitement rules continue to increase
In the first quarter of 2022, users who wanted their content restored mainly submitted appeals to the Board about content Meta removed under its rules on violence and incitement (44%), hate speech (23%), and bullying and harassment (21%). These figures cover only appeals to restore content that Meta had already deemed to have broken its rules; they do not include appeals to remove other people’s content that remains live on Facebook or Instagram because Meta judged it not to violate a Community Standard.
As the red line in the graph above shows, appeals to the Board to restore content removed under Meta’s rules on violence and incitement have increased significantly – rising from 9% in Q4 2020 to 18% in Q2 2021, 29% in Q4 2021, and 44% in Q1 2022.
Meta’s own statistics also show that between Q4 2021 and Q1 2022, the amount of content it took action on for breaking its rules on violence and incitement rose by three-quarters – increasing from 12.4 million posts to 21.7 million.
3. In 14 out of 20 cases shortlisted by the Board in Q1 2022, Meta decided that its original decision on the piece of content was incorrect
After the Board shortlists cases for potential review, Meta sometimes determines that its original decision on a piece of content was incorrect. Meta reversed its decision in 14 out of 20 cases shortlisted by the Board in this quarter – restoring 10 posts to Facebook and Instagram and removing four.
While this is only a small sample, and the Board intentionally seeks out challenging and borderline cases, Meta found its original decision to have been incorrect for 70% of the cases shortlisted in the first quarter of 2022 (14 out of 20). This is significantly higher than the error rate for cases shortlisted from Q4 2020 to Q4 2021, which was 39% (51 out of 130 shortlisted cases). The Board is raising with Meta the questions this poses for the accuracy of its content moderation and for the appeals process the company applies before cases reach the Board.
4. Meta made progress in implementing our recommendations
As part of our cases, the Board makes recommendations to Meta and asks the company questions. This dialogue between the Board and Meta is helping to create a space for scrutiny and accountability that did not previously exist. New information provided in response to our questions and included in our case decisions is enriching public debate about the company. And in response to our recommendations, the company is making changes that improve how it treats the people who use its platforms.
Our recommendations have repeatedly urged Meta to be clear with people about why it removed their posts. In response, the company now gives people using Facebook in English who break its hate speech rules more detail on what they’ve done wrong, and is expanding this messaging to more violation types.
In response to a recommendation in our policy advisory opinion on sharing private residential information, Meta also committed to remove the exception that allows the sharing of private residential information when it is considered “publicly available” – a crucial step for ensuring that users’ privacy is protected on Facebook and Instagram.
Meta also made progress towards implementing several of our past recommendations. In response to a recommendation in our “Claimed COVID cure” decision, Meta created a new Community Standard on misinformation, which includes health misinformation. Putting misinformation policies in one place will make Meta’s rules more accessible to users and help people follow them. In response to other recommendations, Meta has added text about its satire exceptions across several Community Standards, and provided further information on how users can make the intent behind their posts clear. The company also expects to bring Facebook’s and Instagram’s policies into alignment by the end of this year.
What’s next
As part of our commitment to transparency, we will continue to publish transparency reports on a quarterly basis. We will also publish detailed information on our work in our annual report, which assesses Meta’s performance in implementing our decisions and recommendations.