H2 2023 Transparency Report: How the Board’s Ongoing Impact Is Making a Difference
March 19, 2024
Today, the Oversight Board published its Transparency Report for the second half of 2023. Alongside an overview of the Board’s activities during this period, the report demonstrates how the recommendations we have made since 2021 continue to push Meta to make changes and introduce new initiatives that improve how people and communities are treated around the world.
User Notifications
In two separate recommendations, the Oversight Board has urged Meta to give users the chance to remove and repost content without the part that violates the Community Standards. In response, the company committed to exploring ways of sending users notifications before enforcement action is taken, and has now provided supporting data.
* Over a 12-week period in 2023, in which pre-enforcement notifications were sent to users across 100 million pieces of content, users took the opportunity to delete their posts more than 20% of the time. For the Board, this shows that speech is being better protected, with users able to adjust their posts to express the same thing without violating Meta’s rules and risking content takedowns.
* PLEASE NOTE: All information is aggregated and de-identified to protect user privacy. All metrics are estimates, based on best information available for a specific point in time.
Equitable Access to Data
To redress the lack of access to Meta’s public data, especially among researchers in Global South countries, the Oversight Board called on the company to ensure equitable access. Meta has now launched a Content Library, so that researchers in all parts of the world can apply for access to the company’s archive of public data on Facebook and Instagram.
Expediting Slurs Review for Countries With Elections
Last year, the Board noted that errors made in enforcing Meta’s Hate Speech policy, specifically around the company’s lists of slurs in countries with upcoming elections, were compromising news reporting on issues of public interest. This was particularly concerning heading into 2024, a historic election year. According to the company, an expedited review of the slurs (offensive words or insults that could be violating) included on Meta’s lists for countries with imminent elections has now been completed.
Preserving Evidence of Human Rights Violations
In 2021, the Board asked Meta to develop a protocol for keeping evidence of grave violations of international human rights and humanitarian law. Since then, the company has been finalizing a single approach to preserving evidence from its platforms of potential atrocity crimes and grave violations of human rights and humanitarian law, so this evidence can be shared with international courts and recognized authorities in the future.
Pushing for Transparency
Two of the Oversight Board’s decisions have included recommendations relating to government requests to take down users’ content. We asked Meta to look at how it collects information on such requests to enable public reporting on them. Meta now says that its new system for tracking government requests stipulates that a standard set of questions must be answered, a step the company says will lead to improved transparency and greater capacity for public reporting.
Improving Automated Enforcement
In a previous decision, the Board expressed concern about content being left in Meta’s Media Matching Banks when it should not be, and recommended the company make changes to this service. The Media Matching Service automatically finds and removes images that have been identified by the company’s human reviewers as breaking Meta’s rules. However, when users have successfully appealed to prove that content does not break the rules, the Board said this content should be reassessed for removal from the banks. Meta has been refining this part of its automated enforcement system to improve its record and reassess such content.
Latest on Recommendations
Since 2021, Meta has fully or partially implemented or has reported progress on 146 of our 251 recommendations,† as verified by data provided to the Board, which we assess using our own independent method.
† PLEASE NOTE: Recommendation numbers are up to date as of March 4, 2023.
Highlights From the Second Half of 2023
From July to December, our activities included:
- A total of 43 decisions issued, comprising two expedited decisions, 15 standard decisions and 26 summary decisions.
- At the end of 2023, our first two expedited decisions saw the Board focusing on protecting the right to expression of people on all sides of the ongoing Israel-Hamas conflict. The Board can carry out expedited reviews to respond to urgent, select cases.
- The Board also published more than 25 summary decisions, which examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions shed more light on the corrections that Meta is making in direct response to our work, and highlight areas in which the company needs to do more to improve its enforcement and reduce such errors in the first place.
- In addition to reinforcing our existing recommendations, summary decisions are already directly influencing Meta’s internal decision-making. For example, one of our 2023 summary decisions, Metaphorical Statement Against the President of Peru, was considered by Meta before it reversed its original decision in the Iranian Woman Confronted on Street case, a decision we issued recently, in March 2024. In this case, the Board pushed Meta to be more sensitive to context when considering figurative (non-literal) speech, especially in countries where state repression is a factor.
- A total of 16 recommendations made, split almost evenly across Meta’s policies and how it enforces those policies.
During this period, we also:
- Overturned Meta in 82% of our decisions.
- Received 245 public comments from around the world.
What’s Next
As part of our commitment to transparency, we will publish transparency reports every six months. They will include data about how our recommendations continue to change Meta’s approach to content moderation.