Upheld
Tigray Communication Affairs Bureau
The Oversight Board has upheld Meta’s decision to remove a post threatening violence in the conflict in Ethiopia
This decision is also available in Amharic, Oromo and Tigrinya.
To read the full decision in Amharic, click here.
To read the full decision in Oromo, click here.
To read the full decision in Tigrinya, click here.
Case summary
The Oversight Board has upheld Meta’s decision to remove a post threatening violence in the conflict in Ethiopia. The content violated Meta's Violence and Incitement Community Standard and removing it is in line with the company's human rights responsibilities. Overall, the Board found that Meta must do more to meet its human rights responsibilities in conflict situations and makes policy recommendations to address this.
About the case
On February 4, 2022, Meta referred a case to the Board concerning content posted on Facebook during a period of escalating violence in the conflict in Ethiopia, where Tigrayan and government forces have been fighting since November 2020.
The post appeared on the official page of the Tigray Regional State’s Communication Affairs Bureau and was viewed more than 300,000 times. It discusses the losses suffered by federal forces and encourages the national army to “turn its gun” towards the “Abiy Ahmed group.” Abiy Ahmed is Ethiopia’s Prime Minister. The post also urges government forces to surrender and says they will die if they refuse.
After being reported by users and identified by Meta’s automated systems, the content was assessed by two Amharic-speaking reviewers. They determined that the post did not violate Meta’s policies and left it on the platform.
At the time, Meta was operating an Integrity Product Operations Center (IPOC) for Ethiopia. IPOCs are used by Meta to improve moderation in high-risk situations. They operate for a short time (days or weeks) and bring together experts to monitor Meta's platforms and address any abuse. Through the IPOC, the post was sent for expert review, found to violate Meta’s Violence and Incitement policy, and removed two days later.
Key findings
The Board agrees with Meta’s decision to remove the post from Facebook.
The conflict in Ethiopia has been marked by sectarian violence and violations of international law. In this context, and given the profile and reach of the page, there was a high risk that the post could lead to further violence.
As a result, the Board agrees that removing the post is required by Meta’s Violence and Incitement Community Standard, which prohibits “statements of intent to commit high-severity violence.” The removal also aligns with Meta’s values; given the circumstances, the values of “Safety” and “Dignity” prevail over “Voice.” The Board also finds that removal of the post aligns with Meta’s human rights responsibilities and is a justifiable restriction on freedom of expression.
Meta has long been aware that its platforms have been used to spread hate speech and fuel violence in conflict. The company has taken positive steps to improve content moderation in some conflict zones. Overall, however, the Board finds that Meta has a human rights responsibility to establish a principled, transparent system for moderating content in conflict zones to reduce the risk of its platforms being used to incite violence or violations of international law. It must do more to meet that responsibility.
For example, Meta provides insufficient information on how it implements its Violence and Incitement policy in armed conflict situations, what policy exceptions are available, or how they are used. Its current approach to content moderation in conflict zones suggests inconsistency; observers have accused the company of treating the Russia-Ukraine conflict differently from others.
While Meta says it compiles a register of “at-risk” countries, which guides its allocation of resources, it does not provide enough information for the Board to evaluate the fairness or efficacy of this process. The IPOC in this case led to the content being removed. However, the content remained on the platform for two days. This suggests that the “at-risk” system and IPOCs are inadequate to deal with conflict situations. According to Meta, IPOCs are “not intended to be a sustainable, long-term solution to dealing with a years-long conflict.” The Board finds Meta may need to invest in a more sustained mechanism.
The Oversight Board’s decision
The Oversight Board upholds Meta’s decision to remove the post.
The Board also makes the following recommendations:
- Meta should publish information on its Crisis Policy Protocol in the Transparency Center.
- Meta should assess the feasibility of establishing a sustained internal mechanism that provides it with the expertise, capacity and coordination required to review and respond to content effectively for the duration of a conflict.
*Case summaries provide an overview of the case and do not have precedential value.
Full case decision
1. Decision summary
The Oversight Board upholds Meta’s decision to remove the content from Facebook for violating the Violence and Incitement Community Standard. The Board finds that removing the content in this case is consistent with Meta’s human rights responsibilities in an armed conflict. The Board also finds that Meta has a responsibility to establish a principled and transparent system for moderating content in conflict zones to mitigate the risks of its platforms being used to incite violence or commit violations of international human rights and humanitarian law. The Board reiterates the need for Meta to adopt all measures aimed at complying with its responsibility to carry out heightened human rights due diligence in this context.
2. Case description and background
On February 4, 2022, Meta referred a case to the Board concerning content posted on Facebook on November 5, 2021. The content was posted by the Tigray Communication Affairs Bureau page, which states that it is the official page of the Tigray Regional State Communication Affairs Bureau (TCAB). The content was posted in Amharic, the Federal Government’s official working language. The TCAB is a ministry within the Tigray regional government. Since November 2020, the Tigray People’s Liberation Front (TPLF) and the Federal Democratic Republic of Ethiopia (“Federal Government”) have been engaged in an armed conflict. The TPLF is the ruling party in Tigray, while the Tigray Defense Forces is the TPLF’s armed wing.
The post discusses the losses suffered by the Federal National Defense Forces under the leadership of Prime Minister Abiy Ahmed in the armed conflict with the TPLF. The post encourages the national army to “turn its gun towards the fascist Abiy Ahmed group” to make amends to the people it has harmed. It goes on to urge the armed forces to surrender to the TPLF if they hope to save their lives, adding: “If it refuses, everyone should know that, eventually, the fate of the armed forces will be death.”
Tensions between the Federal Government and the TPLF reached their peak when the Federal Government postponed the elections in 2020, citing the coronavirus pandemic as the reason for the delay. Opposition leaders accused the Prime Minister of using the pandemic as an excuse to extend his term. Despite the Federal Government’s announcement, the Tigray regional government proceeded to conduct elections within the region, which the TPLF won by a landslide.
Prime Minister Abiy Ahmed announced a military operation against Tigrayan forces in November 2020 in response to an attack on a federal military base in Tigray. Federal forces pushed through to take Tigray’s capital, Mekelle. After eight months of fighting, federal forces and their allies withdrew from Mekelle and the TPLF retook control. In May 2021, the Federal Government designated the TPLF a terrorist organization.
On November 2, 2021, days before the content was posted, the Prime Minister imposed a nationwide state of emergency after the TPLF took over certain parts of the Amhara and Afar regions, beyond Tigray. The Federal Government also called on citizens to take up arms as the TPLF made its way towards the capital, Addis Ababa. On November 5, the day the content was posted, nine opposition groups, including the TPLF, created an alliance to put pressure on the Federal Government and oust the Prime Minister.
The TCAB page has about 260,000 followers and is set to public, meaning it can be viewed by any Facebook user. It is verified by a blue checkmark badge, which confirms that the page or profile is the authentic presence of a person or entity. The content was viewed more than 300,000 times and shared fewer than 1,000 times.
Beginning on November 5, the content was reported by 10 users for violating the Violence and Incitement, Dangerous Individuals and Organizations, and Hate Speech policies. Additionally, Meta’s automated systems identified the content as potentially violating and sent it for review. Following review by two human reviewers, both of whom were Amharic speakers, Meta determined that the content did not violate its policies and did not remove it from the platform.
On November 4, a day before the content was posted, Meta convened an Integrity Product Operations Center (IPOC) to monitor and respond in real time to the rapidly unfolding situation in Ethiopia. According to Meta, an IPOC is a group of subject matter experts within the company brought together for a short period to provide real-time monitoring and address potential abuse flowing across Meta’s platforms. Through the IPOC, the content was escalated for additional review by policy and subject matter experts. Following this review, Meta determined the content violated the Violence and Incitement policy, which prohibits “statements of intent to commit high severity violence.” The content remained on the platform for approximately two days before it was removed.
Since the beginning of the conflict in November 2020, there have been credible reports of violations of international human rights and humanitarian law by all parties to the conflict. The Report of the joint investigation of the Ethiopia Human Rights Commission and Office of the United Nations High Commissioner for Human Rights found documented instances of torture and other forms of cruel, inhuman, or degrading treatment, extrajudicial executions of civilians and captured combatants, kidnappings, forced disappearances, and sexual and gender-based violence, among other international crimes (see also Ethiopia Peace Observatory). The joint investigation team found that persons taking no direct part in the hostilities were killed by both sides to the conflict. This included ethnic-based and retaliatory killings. Both federal forces and Tigrayan forces “committed acts of torture and ill-treatment against civilians and captured combatants in various locations in Tigray, including in military camps, detention facilities, victims’ homes, as well as secret and unidentified locations.” Individuals perceived to be affiliated with the TPLF were forcibly disappeared or arbitrarily detained, and the wives of disappeared or detained men were subjected to sexual violence by federal armed forces. Similarly, wives of members of the federal armed forces were sexually assaulted or raped by Tigrayan combatants. Many people were gang-raped. Federal armed forces also refused to facilitate access to humanitarian relief in conflict-affected areas. Other armed groups and militias have also been involved, and Eritrea has supported Ethiopia’s national army in the conflict. Although the joint investigation covered events occurring between November 3, 2020 and June 28, 2021, the findings provide significant context for this case and the later escalation in hostilities in November 2021, when the TPLF seized territory outside of Tigray.
3. Oversight Board authority and scope
The Board has authority to review decisions that Meta refers for review (Charter Article 2, Section 1; Bylaws Article 2, Section 2.1.1). The Board may uphold or overturn Meta’s decision (Charter Article 3, Section 5), and this decision is binding on the company (Charter Article 4). Meta must also assess the feasibility of applying its decision in respect of identical content with parallel context (Charter Article 4). The Board’s decisions may include policy advisory statements with non-binding recommendations to which Meta must respond (Charter Article 3, Section 4; Article 4).
4. Sources of authority
The Oversight Board considered the following sources of authority:
I. Oversight Board decisions:
The most relevant previous decisions of the Oversight Board include:
- “Alleged crimes in Raya Kobo” [Case decision 2021-014-FB-UA]: The Board recommended that Facebook’s Community Standards should reflect that unverifiable rumors pose a higher risk to the rights to life and security of persons in the contexts of war and violent conflict. The Board also recommended that Meta commission an independent human rights due diligence assessment on how the use of its platforms has heightened the risk of violence in Ethiopia.
- “Former President Trump’s Suspension” [Case decision 2021-001-FB-FBR]: The Board recommended that Meta should develop and publish a policy for crisis situations. The Board also made a recommendation about the need to collect, preserve and, where appropriate, share information to assist in the investigation and potential prosecution of grave violations of international criminal, human rights and humanitarian law by competent authorities and accountability mechanisms.
- “Sudan graphic video” [Case decision 2022-002-FB-MR]: The Board reiterated its recommendation in the “Former President Trump’s Suspension” case for Meta to develop and publish a policy in response to crisis situations “where its regular processes would not prevent or avoid imminent harm.”
II. Meta’s content policies:
Facebook’s Community Standards:
Under its Violence and Incitement policy, Meta states that it will remove any content that “incites or facilitates serious violence.” The policy prohibits “threats that could lead to death (and other forms of high-severity violence) … targeting people or places.” It also prohibits “statements of intent to commit high-severity violence.”
III. Meta’s values:
Meta’s values are outlined in the introduction to Facebook’s Community Standards. The value of “Voice” is described as “paramount”:
The goal of our Community Standards has always been to create a place for expression and give people a voice. [We want] people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.
Meta limits “Voice” in service of four other values, two of which are relevant here:
“Safety”: We remove content that could contribute to a risk of harm to the physical security of persons.
“Dignity” : We expect that people will respect the dignity of others and not harass or degrade others.
IV. International human rights standards
The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. In 2021, Meta announced its Corporate Human Rights Policy, where it reaffirmed its commitment to respecting human rights in accordance with the UNGPs. Significantly, the UNGPs impose a heightened responsibility on businesses operating in a conflict setting (“Business, human rights and conflict-affected regions: towards heightened action,” A/75/212). The Board's analysis of Meta’s human rights responsibilities in this case was informed by the following human rights standards:
- The right to freedom of expression: Article 19, International Covenant on Civil and Political Rights (ICCPR); General Comment No. 34, Human Rights Committee, 2011; UN Special Rapporteur on freedom of opinion and expression, reports: A/HRC/38/35 (2018) and A/73/348 (2018).
- The right to life: Article 6, ICCPR, General Comment No. 36, Human Rights Committee (2018).
- The right not to be subjected to torture or cruel, inhuman, or degrading treatment or punishment: Article 7, ICCPR.
- The right to security of person: Article 9, para. 1, ICCPR.
5. User submissions
Following Meta’s referral and the Board’s decision to accept the case, the user was sent a message notifying them of the Board’s review and providing them with an opportunity to submit a statement to the Board. The user did not submit a statement.
6. Meta’s submissions
In its referral of the case to the Board, Meta stated that the decision regarding the content was difficult because it involved removing “official government speech that could be considered newsworthy,” but may pose a risk of inciting violence during an ongoing conflict. Meta stated that it did not consider granting the newsworthiness allowance because that allowance does not apply to content that presents a risk of contributing to physical harm.
Meta stated that, since late 2020, it has treated Ethiopia as a Tier 1 at-risk country, the highest risk level. According to Meta, classifying countries as at-risk is part of its process for prioritizing investment in product resources over the long term. For example, in response to the high risk in Ethiopia, Meta developed language classifiers (machine learning tools trained to automatically detect potential violations of the Community Standards) in Amharic and Oromo, two of the most widely used languages in Ethiopia. According to the company, the initial Amharic and Oromo classifiers were launched in October 2020. In June 2021, Meta launched what it refers to as the “Hostile Speech” classifiers in Amharic and Oromo (machine learning tools trained to identify content subject to the Hate Speech, Violence and Incitement, and Bullying and Harassment policies). The company also created an IPOC for Ethiopia on November 4, 2021 in response to the escalation in the conflict. IPOCs typically operate for several days or weeks. An IPOC is convened either for planned events, such as certain elections, or in response to high-risk unplanned events. An IPOC can be requested by any Meta employee. The request is reviewed by a multi-level, multi-stakeholder group within the company that includes representatives from its Operations, Policy, and Product teams. There are different levels of IPOC, providing escalating levels of coordination and communication in monitoring content on the platform. The IPOC convened in November 2021 for Ethiopia was Level 3, which “involves the greatest level of coordination and communication within Meta.” As Meta explained, IPOCs are a “short-term solution” meant to “understand a large set of issues and how to address them across a crisis or high-risk situation. It is not intended to be a sustainable, long-term solution to dealing with a years-long conflict.”
Meta referred to the Board’s analysis in the “Alleged crimes in Raya Kobo” case in support of the proposition that resolving the tension between protecting freedom of expression and reducing the threat of sectarian conflict requires careful consideration of the specifics of the conflict. Meta also noted the documented atrocity crimes committed by all sides of the conflict. Meta told the Board that, given the nature of the threat, the influential status of the speaker, and the rapidly escalating situation in Ethiopia at the time the content was posted, the value of “Safety” outweighed other considerations and would be better served by removing the post than by leaving it on the platform, despite the potential value of the content in warning individuals in Ethiopia of future violence.
The Board asked Meta 20 questions. Meta answered 14 questions fully and six questions partially. The partial responses related to the company’s approach to content moderation in armed conflict situations, imposing account restrictions for violations of content policies, and the cross-check process.
7. Public comments
The Oversight Board received and considered seven public comments related to this case. One comment was submitted from Asia Pacific and Oceania, three from Europe, one from Sub-Saharan Africa, and two from the United States and Canada.
The submissions covered the following themes: the inconsistency of Meta’s approach in the context of different armed conflicts; the heightened risk accompanying credible threats of violence between parties during an armed conflict; the problems with Meta’s content moderation in Ethiopia and the role of social media in closed information environments; factual background to the conflict in Ethiopia, including the harm suffered by Tigrayan people and the role of hate speech against Tigrayans on Facebook in spreading violence; and the need to consider laws of armed conflict in devising policies for moderating speech during an armed conflict.
To read public comments submitted for this case, please click here.
In April 2022, as part of ongoing stakeholder engagement, the Board consulted representatives of advocacy organizations, academics, inter-governmental organizations and other experts on the issue of content moderation in the context of armed conflict. Discussions included the treatment of speech by parties to a conflict and the application of the Violence and Incitement policy in conflict situations.
8. Oversight Board analysis
The Board examined the question of whether this content should be restored, and the broader implications for Meta’s approach to content governance, through three lenses: Meta's content policies, the company's values and its human rights responsibilities.
8.1 Compliance with Meta’s content policies
The Board finds that removing the content from the platform is consistent with the Violence and Incitement Community Standard. The policy prohibits “threats that could lead to death (and other forms of high-severity violence) … targeting people or places,” including “statements of intent to commit high-severity violence.” The Board finds that the content can be reasonably interpreted by others as a call that could incite or encourage acts of actual violence in the already violent context of an armed conflict. As such, the content violates Meta’s prohibition on “statements of intent to commit high-severity violence.”
8.2 Compliance with Meta’s values
The Board concludes that removing this content from the platform is consistent with Meta’s values of “Safety” and “Dignity.”
The Board recognizes the importance of “Voice” especially in a country with a poor record of press and civic freedoms and where social media platforms serve as a key means of imparting information about the ongoing armed conflict. However, in the context of an armed conflict, marked by a history of sectarian violence and violations of international law, the values of “Safety” and “Dignity” prevail in this case to protect users from content that poses a heightened risk of violence. The content in this case can be interpreted as a call to kill "Abiy Ahmed’s group." It can further be interpreted as a warning of punishment to those who will not surrender to the TPLF, and as such poses a risk to the life and physical integrity of Ethiopian federal forces and political leaders. While the content was shared by the governing regional body, the post itself does not contain information with sufficiently strong public interest value to outweigh the risk of harm.
8.3 Compliance with Meta’s human rights responsibilities
The Board finds that removing the content in this case is consistent with Meta’s human rights responsibilities. During an armed conflict, the company also has a responsibility to establish a principled and transparent system for moderating content where there is a reasonable probability that the content would succeed in inciting violence. The Board notes the heightened risk of content directly contributing to harm during an armed conflict. The Board finds that Meta currently lacks a principled and transparent framework for content moderation in conflict zones.
Freedom of expression (Article 19 ICCPR)
Article 19 of the ICCPR provides broad protection for freedom of expression, including the right to seek and receive information about possible violence. However, the right may be restricted under specific conditions that satisfy the three-part test of legality (clarity), legitimacy, and necessity and proportionality. Meta has committed to respect human rights under the UNGPs and to look to authorities such as the ICCPR when making content decisions, including in situations of armed conflict. The Rabat Plan of Action also provides useful guidance on this matter. As the UN Special Rapporteur on freedom of expression has stated, although “companies do not have the obligations of Governments, their impact is of a sort that requires them to assess the same kind of questions about protecting their users’ right to freedom of expression” (A/74/486, para. 41).
I. Legality (clarity and accessibility of the rules)
Any restriction on freedom of expression should be accessible and clear enough to provide guidance to users and content reviewers as to what content is permitted on the platform and what is not. Lack of clarity or precision can lead to inconsistent and arbitrary enforcement of the rules.
The Violence and Incitement policy prohibits “threats that could lead to death” and, in particular, “statements of intent to commit high-severity violence.” The Board finds that the applicable policy in this case is clear. However, in the course of deciding this case, the Board found that Meta provides insufficient information on how it implements the Violence and Incitement policy in situations of armed conflict, what policy exceptions are available and how they are used, or what specialized enforcement processes the company uses in such situations.
II. Legitimate aim
Restrictions on freedom of expression should pursue a legitimate aim, which includes the respect of the rights of others, and the protection of national security or public order. The Facebook Community Standard on Violence and Incitement exists to prevent offline harm that may be related to content on Facebook. As previously concluded by the Board in the “Alleged crimes in Raya Kobo” case decision, restrictions based on this policy serve the legitimate aim of protecting the rights to life and bodily integrity.
III. Necessity and proportionality
Necessity and proportionality require Meta to show that its restriction on speech was necessary to address the threat, in this case the threat to the rights of others, and that it was not overly broad (General Comment 34, para. 34). In making this assessment, the Board also considered the factors in the Rabat Plan of Action on what constitutes incitement to violence (The Rabat Plan of Action, OHCHR, A/HRC/22/17/Add.4, 2013), while accounting for differences between the international law obligations of states and the human rights responsibilities of businesses.
In this case, the Board finds that removing this content from the platform was a necessary and proportionate restriction on freedom of expression under international human rights law. Using the Rabat Plan of Action’s six-part test to inform its analysis, the Board finds support for the removal of this post.
The context in Ethiopia, the status and intent of the speaker, the content of the speech and its reach, and the likelihood of offline harm all contribute to a heightened risk of offline violence:
1. Context: The content was posted in the context of an ongoing and escalating civil war. Since its beginning, the conflict has been marked by violations of international human rights and humanitarian law committed by all parties to the conflict.
2. Speaker: The speaker is a regional government ministry affiliated with one of the parties to the conflict, with significant reach and influence, including the authority to direct the Tigrayan armed forces.
3. Intent: Given the language and context, there is at the very least an explicit call to kill soldiers who do not surrender, and further intent to commit harm can reasonably be inferred.
4. Content: The post can be read to advocate targeting combatants and political leaders, regardless of their participation in the hostilities.
5. Extent of dissemination: The content was posted on the public page of a body connected to one of the parties to the conflict with about 260,000 followers and remained on the platform for two days before being removed.
6. Likelihood and imminence: The content was posted around the time that TPLF forces were advancing towards parts of Ethiopia beyond Tigray, and as the Prime Minister declared a nationwide state of emergency and called on civilians to take up arms and fight.
While the Board found that removing the content in this case was necessary and proportionate, it also became clear to the Board in reviewing the case that more transparency is needed to assess whether Meta’s measures are consistently proportionate throughout a conflict and across all armed conflict contexts. The company has long been aware of how its platforms have been used to spread hate speech and fuel ethnic violence. While Meta has taken positive steps to improve its moderation system in some conflicts (for instance, commissioning an independent assessment of bias in content moderation in the Israeli-Palestinian conflict in response to the Board’s recommendation), it has not done enough to evaluate its existing policies and processes and to develop a principled and transparent framework for content moderation in conflict zones. Some Board Members have expressed that Meta’s content moderation in conflict zones should also be informed by international humanitarian law.
In Ethiopia, Meta has outlined the steps it has taken to remove content that incites others to violence. The company refers to two general processes for countries at risk of or experiencing violent conflict, which were used in Ethiopia: the “at-risk countries” tiering system and IPOCs. Ethiopia has been designated a Tier 1 at-risk country (the highest risk) since late 2020 and had a Level 3 IPOC (the highest level) at the time the content was posted. Despite this, the content was removed only two days after it was posted, notwithstanding the clear policy line it violated. The Board notes that two days, in the context of an armed conflict, is a considerable time span given the Rabat assessment outlined above. This also suggests the inadequacy of the at-risk tiering system and IPOCs as a solution to deal with events posing heightened human rights risks.
Meta does not provide enough public information on the general method or criteria used for the “at-risk countries” assessment and the product investments the company has made as a result, in Ethiopia and other conflict situations. Without this information, neither the Board nor the public can evaluate the effectiveness and fairness of these processes, whether the company’s product investments are equitable or whether they are implemented with similar speed and diligence across regions and conflict situations.
IPOCs are, in the words of Meta, “short-term solutions” and convened on an ad hoc basis. This suggests to the Board that there may be a need for the company to invest greater resources in a sustained internal mechanism that provides the expertise, capacity and coordination necessary to review and respond to content effectively for the entirety of a conflict. Such assessment should be informed by policy and country expertise.
Meta’s current approach to content moderation in conflict zones could lead to the appearance of inconsistency. There are currently some 27 armed conflicts in the world, according to the Council on Foreign Relations. In at least one conflict (Russia-Ukraine), Meta has, to some observers, appeared to promptly take action and create policy exceptions to allow content that would otherwise be prohibited under the Violence and Incitement policy, while taking too long to respond in other conflict situations. One public comment (PC-10433), submitted by Dr. Samson Esayas, associate professor at BI Norwegian Business School, noted Meta’s “swift measures” in moderating content in the context of the Russia-Ukraine conflict and highlighted the “differential treatment between this conflict and conflicts in other regions, particularly Ethiopia and Myanmar.” This suggests an inconsistent approach, which is problematic for a company of Meta’s reach and resources, especially in the context of armed conflict.
9. Oversight Board decision
The Oversight Board upholds Meta's decision to remove the content for violating the Violence and Incitement Community Standard.
10. Policy advisory statement
Transparency
1. In line with the Board’s recommendation in the “Former President Trump’s Suspension” decision, as reiterated in the “Sudan graphic video” decision, Meta should publish information on its Crisis Policy Protocol. The Board will consider this recommendation implemented when information on the Crisis Policy Protocol is published in the Transparency Center as a separate policy, in addition to the Public Policy Forum slide deck, within six months of this decision being published.
Enforcement
2. To improve enforcement of its content policies during periods of armed conflict, Meta should assess the feasibility of establishing a sustained internal mechanism that provides the expertise, capacity and coordination required to review and respond to content effectively for the duration of a conflict. The Board will consider this recommendation implemented when Meta provides an overview of the feasibility of a sustained internal mechanism to the Board.
* Procedural note:
The Oversight Board’s decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.
For this case decision, independent research was commissioned on behalf of the Board from an independent research institute headquartered at the University of Gothenburg, which draws on a team of over 50 social scientists on six continents as well as more than 3,200 country experts from around the world. The Board was also assisted by Duco Advisors, an advisory firm focusing on the intersection of geopolitics, trust and safety, and technology.