Q2 2023 Transparency Report: Board’s Recommendations Lead to Key Changes in Meta’s Cross-Check Program
Today, the Oversight Board published its Transparency Report for Q2 2023. Alongside an overview of the Board’s activities in this quarter, the report also includes data showing how recommendations from our policy advisory opinion on Meta’s cross-check program have led to substantial, positive changes that benefit the people who use Facebook and Instagram.
Enforcement Exemption List Reduced by More Than Half
In October 2021, coverage in the Wall Street Journal about Meta’s cross-check program raised questions about how the company was treating its most powerful users. This included a practice described as “allowlisting” or “whitelisting,” which exempted certain content from enforcement for specific policies. Meta refers to this practice as “technical corrections” and acknowledged in our policy advisory opinion that a “lack of governance over practices in the past, […] inadvertently resulted in some entities not receiving many enforcement actions.”
In response to our recommendations, Meta has now established clear criteria around these practices, including who should benefit from them. This new approach has already led to an immediate decrease in the overall size of the technical corrections list by more than half (55%).
Outstanding Backlogs Cleared for Content From Users on Meta’s Cross-Check Lists
If a post from a user on Meta’s cross-check lists is identified as breaking the company’s rules, it remains on the platform pending further review. This means that, because of cross-check, content identified as breaking Meta’s rules is left up on Facebook and Instagram when it is most viral and could cause harm. As the volume of content selected for cross-check can exceed Meta’s review capacity, in the past the program has operated with a backlog, which delays decisions on such content.
Our policy advisory opinion noted that, in September 2021, the Wall Street Journal reported that Brazilian soccer star Neymar posted non-consensual intimate imagery of another person on his Facebook and Instagram accounts. According to reporting by The Guardian, the video was online for over a day, and “an internal review of the Neymar posts found that the video was viewed 56 million times on Facebook and Instagram before removal,” despite representing a clear violation of Meta’s content policies. According to Meta, the reason for the prolonged accessibility of this violating content was a “delay in reviewing the content due to a backlog at the time.”
In our policy advisory opinion, we called on Meta not to operate cross-check with a backlog. Meta now reports that it has cleared all outstanding backlogs in its cross-check review queues dedicated to potentially violating content from entities on its lists, producing a 96% decrease in resolution time (the time taken for review and any subsequent enforcement) for 90% of the jobs created in the first half of 2023, compared with the second half of 2022.
A smaller backlog means that Meta can review and take enforcement action against potentially violating content faster. This, in turn, helps the company reduce the risk of users being exposed to violating content while it is awaiting review.
For more details on our impact on Meta’s cross-check program, please see Meta’s Q2 2023 update on the Board here.
The Oversight Board in Q2 2023
Our Q2 2023 Transparency Report also contains an overview of the Board’s activities during this period, including the following highlights:
- We published decisions on six cases in Q2 2023. Three of these were standard decisions: Armenian Prisoners of War, Brazilian General’s Speech and Cambodian Prime Minister, while a further three (Anti-Colonial Leader Amílcar Cabral, Metaphorical Statement Against the President of Peru and Dehumanizing Speech Against a Woman) were summary decisions. These examine cases in which Meta reversed its original decision on a piece of content after the Board brought it to the company’s attention.
- We also published a policy advisory opinion on the Removal of COVID-19 Misinformation, including 18 recommendations. In combination with the recommendations from other case decisions published in Q2 2023, we made 30 recommendations in total during this quarter.
- We received 259 public comments ahead of the deliberations on our three standard cases and one policy advisory opinion (summary decisions do not consider public comments).
- Users submitted nearly 100,000 cases to the Board in Q2 2023, with close to 25% from Instagram users, the highest-ever proportion recorded from this platform.
Meta Implements More Recommendations
To ensure that Meta delivers on its commitments, we monitor the company’s progress towards implementing our recommendations, using our own, independent, data-driven approach. In our Q2 2023 Transparency Report, we note that the overall number of recommendations fully or partially implemented by Meta increased by 16 recommendations (rising from 44 in our Q1 report to 60 in our Q2 report), representing the largest-ever jump between the publication of quarterly reports.
Community Standards in More Languages
In April 2021, our Punjabi Concern over the RSS in India decision urged Meta to translate its Community Standards into all languages widely spoken by its users. Since then, Meta has translated Facebook’s rules into more than 20 additional languages (including Pashto and Somali, which Meta announced in its Q2 2023 quarterly update). This means that, since we made this recommendation, Meta has translated Facebook’s rules into languages spoken by more than a billion people worldwide.
What’s Next
As part of our commitment to transparency, we will continue to publish transparency reports on a quarterly basis. These will include data about how our recommendations are changing Meta’s approach to content moderation.