The Oversight Board Expands to Threads
February 22, 2024
Today, the Oversight Board is expanding its scope to include Threads, a Meta app that allows users to share text updates and join public conversations. We have been able to review and decide appeals from Facebook and Instagram users since October 2020, but today marks a milestone: the first time the Board’s scope has expanded to cover a new app.
When social media platforms open up their content moderation decision-making to independent oversight, the result is better, more accountable decisions. This, in turn, builds trust with the people who use those platforms. The Board welcomes Meta’s decision to extend this approach to Threads. We believe we can help Threads be more transparent, take a global approach, and respect freedom of expression and other human rights.
What is Happening?
From today, people using Threads will be able to challenge Meta’s decisions by appealing eligible content to the Oversight Board. The appeals process for Threads is similar to the one for Facebook and Instagram: once users have exhausted Meta’s internal appeals process, the company will issue an Oversight Board reference ID that allows them to submit their case for review on the Oversight Board website. In addition to appeals from the 130 million people using Threads, Meta itself will be able to refer cases about content on Threads to the Board.
How Will the Board Make Decisions About Content on Threads?
For Threads, as with Facebook and Instagram, we will select cases that have the potential to affect many people, are of critical importance to public discourse, or raise important questions about Meta’s policies. By tackling problems shared by millions of users and proposing solutions to them, we ensure the impact of our work is felt far beyond individual cases.
As with Facebook and Instagram, the Board will be able to review cases on several different timelines, depending on a case’s nature, urgency, and complexity. Some cases will be treated as standard decisions, issued on a 90-day timeline with public comments. Others, where Meta recognizes its error once the Board selects the case, will be summary decisions, issued with only a brief opinion. Cases of special urgency will be decided on an expedited basis.
Board Members will examine Meta’s content decisions against Instagram’s Community Guidelines (which apply to Threads), the company’s values, and its international human rights commitments, especially the principle of freedom of expression. Our decisions will be binding, and Meta must implement them within seven days. We will also be able to make recommendations for how Meta can improve its approach to content moderation, to which the company must respond within 60 days.
The company has informed us that, because Threads is still evolving rapidly, for a period of 12 months while the app stabilizes Meta will be able to implement policy and enforcement recommendations from decisions about Threads, but not product-specific recommendations.
Changing Meta’s Approach to Content Moderation
The Board’s track record of impact on Facebook and Instagram suggests that we can also make an important difference on Threads.
For example, in response to our recommendations, Meta created Account Status, which tells people what penalties Meta has applied to their account and why. The company has built new classifiers to stop breast cancer awareness content from being automatically removed. And it is finalizing a new, consistent approach to preserving potential evidence of atrocities and serious violations of human rights law. These are only a few instances of how the Board’s work is making Meta’s approach more transparent, more principled, and more global in outlook.
The Board will bring some four years of experience to its review of Threads. During this time, we have evolved our approach, including the changes we made last year to take on more cases, faster. In 2023, we issued important decisions about prisoners of war, former Cambodian Prime Minister Hun Sen, and our first expedited cases, which concerned the Israel-Hamas conflict. We also published a policy advisory opinion about COVID-19 misinformation. We have carried this momentum into 2024, publishing decisions about Holocaust denial, manipulated media, and hate speech against transgender people. While there is more work to do, we will keep pushing Meta to treat its users fairly across all its platforms.
What’s Next?
When we select our first standard cases about content on Threads, we will announce them on our website and invite people to submit public comments. We will publish our final decisions on the Oversight Board website.
Note: The Oversight Board is an independent organization that examines Meta's decisions to remove or leave up content on Facebook, Instagram and Threads in a selected number of emblematic cases. The Board reviews and, where necessary, reverses the company's decisions. The Board’s decisions are binding on Meta. The Board also makes recommendations to improve Meta’s approach to content moderation.