
Oversight Board announces seven strategic priorities


October 2022

In October 2020, the Oversight Board started accepting cases. For the first time, if users had their post removed from Facebook or Instagram and had exhausted the company’s appeals process, they could challenge this decision by appealing to an independent, outside body.

Two years on, we are announcing seven strategic priorities where we want to work with stakeholders to reshape Meta’s approach to content moderation. This will increase our impact in the areas where we can make the biggest difference to how people experience Facebook and Instagram. In addition, as new regulation brings new requirements for tech companies, we stand ready to partner with companies, regulators, and others to share the benefits of an independent, principled approach to content moderation.

Introducing our strategic priorities

Our seven strategic priorities were chosen based on an analysis of cases submitted to the Board, and issues facing users globally. As these priorities will guide the cases the Board selects for review, we encourage people to consider these priorities when submitting appeals to the Board, including when explaining their reasons for disagreeing with Meta’s decisions.

Our strategic priorities are:

  • Elections and civic space

Social media companies face challenges in consistently applying their policies to political expression in many parts of the world, including during elections and large-scale protests. We highlighted the importance of protecting political expression in our “pro-Navalny protests in Russia” decision, while our “mention of the Taliban in news reporting” decision touched upon issues of media freedom. The Board is interested in exploring Meta’s responsibilities in elections, protests, and other key moments for civic participation.

  • Crisis and conflict situations

In times of crisis, such as armed conflict, terrorist attacks, and health emergencies, social media can help people stay safe, but it can also create an environment where misinformation and hatred can spread. Our “alleged crimes in Raya Kobo” and “Tigray Communication Affairs Bureau” decisions examined posts related to the conflict in Ethiopia, while our decision on former President Trump led Meta to adopt a Crisis Policy Protocol. The Board is interested in exploring Meta’s preparedness for potential harms its products can contribute to during armed conflicts, civil unrest, and other emergencies.

  • Gender

Women, non-binary people, and trans people experience obstacles to exercising their rights to freedom of expression on social media. In our “breast cancer symptoms and nudity” decision, for example, Meta’s automated systems failed to apply exceptions for breast cancer awareness, which led to important health information being removed from Instagram. The Board is interested in exploring gendered obstacles women and LGBTQI+ people face in exercising their rights to freedom of expression, including gender-based violence and harassment, and the effects of gender-based distinctions in content policy.

  • Hate speech against marginalized groups

Hate speech creates an environment of discrimination and hostility towards marginalized groups. It is often context-specific and coded, with harm resulting from effects that gradually build up over time. Our “Depiction of Zwarte Piet” decision found that allowing images of blackface to accumulate would create a discriminatory environment for Black people, while our “Wampum belt” and “Reclaiming Arabic words” decisions examined ‘counter speech’ that referenced hate speech in order to resist discrimination. The Board is interested in exploring how Meta should protect members of marginalized groups, while ensuring its enforcement does not incorrectly target those challenging hate.

  • Government use of Meta’s platforms

Governments use Facebook and Instagram to convey their policies and to request that Meta remove content. In response to our “Öcalan’s isolation” decision, Meta agreed to provide information on content removed for violating its Community Standards following a report by a government, and our forthcoming “UK drill music” decision will look at how Meta should respond to requests from national law enforcement. The Board is interested in exploring how state actors use Meta’s platforms and the implications for content moderation.

  • Treating users fairly

When people’s content is removed from Facebook and Instagram, they are not always told which rule they have broken, and a lack of transparency can create the perception that users are not treated equally. The Board is interested in exploring how Meta can do more to treat its users as customers, through providing more specific user notifications, ensuring people can always appeal Meta’s decision to the company, and being more transparent in areas such as ‘strikes’ and cross-check.

  • Automated enforcement of policies and curation of content

While algorithms are crucial to moderating content at scale, there is a lack of transparency around how Meta’s automated systems work and how they affect the content users see. Our “Colombian police cartoon” decision showed how automation can amplify the impact of incorrect content moderation decisions. In response to our “breast cancer symptoms and nudity” decision, Meta is piloting notifications telling people whether it took enforcement action due to human or automated review. The Board is interested in exploring how automated enforcement should be designed, the accuracy of automated systems, and the importance of greater transparency in this area.

Working with stakeholders to increase our impact

As a Board, our achievements so far have been made possible by the efforts of researchers, civil society groups and others who have worked on these issues for many years. To find lasting solutions to our strategic priorities, and the enormously challenging issues they raise, the subject-matter expertise and local knowledge of these stakeholders will be essential.

For all strategic priorities, we will continue to work with stakeholders to understand the policies and enforcement practices Meta most urgently needs to improve, and what kind of cases could provide the opportunity to address these.

We want to partner with organizations across the world to do this through our public comments process, roundtables, and individual conversations. To discuss how your organization can get involved, please sign up for updates from the Board or contact engagement@osbadmin.com.

Championing an independent, principled approach

Finding solutions to the issues raised by our strategic priorities matters not just to Meta, but to all social media companies. New regulation is also bringing new requirements for tech companies, which will require many to change their approach. While there is no single answer and many actors will need to play their part, we believe that the Board’s independent, human rights-based approach can be part of the solution.

Today, social media companies face a significant challenge when moderating content: which rules should apply to billions of people of different nationalities, languages, and cultures? We think international human rights standards are a crucial part of the answer. These apply equally to everyone and provide a consistent framework to consider a user’s right to free expression alongside other human rights, like the right to life or privacy.

We are also seeing growing acceptance, in the tech industry and beyond, of the value of independent oversight. This is about firms opening up their internal processes and inviting outsiders, with no commercial interests in the company, to review their decisions. We believe that this kind of scrutiny leads to better and more robust decisions in the long-term.

As a Board, we have developed a wealth of experience in the last two years. We hope that the Board’s independent, human rights-based approach provides a credible framework for other social media companies, regulators, and civil society organizations, and we stand ready to build new partnerships and share what we have learned in our common endeavor to improve social media.

What’s next

In the last two years, we have received nearly two million appeals from users. Through the 130 recommendations we have made so far, we have started to reshape Meta’s approach to content moderation. Today, in announcing our strategic priorities, we want to deepen our impact, working with stakeholders around the world to find lasting solutions. We also stand ready to partner with companies, regulators, and others to share the benefits of an independent, principled approach to content moderation.

Attachments

Overarching Criteria for Case Selection