Overturned
Shared Al Jazeera post
September 14, 2021
The Oversight Board agrees that Facebook was correct to reverse its original decision to remove content on Facebook that shared a news post about a threat of violence from the Izz al-Din al-Qassam Brigades, the military wing of the Palestinian group Hamas.
Please note that this decision is also available in Arabic and Hebrew.
Case summary
The Oversight Board agrees that Facebook was correct to reverse its original decision to remove content on Facebook that shared a news post about a threat of violence from the Izz al-Din al-Qassam Brigades, the military wing of the Palestinian group Hamas. Facebook originally removed the content under the Dangerous Individuals and Organizations Community Standard, and restored it after the Board selected this case for review. The Board concludes that removing the content did not reduce offline harm and restricted freedom of expression on an issue of public interest.
About the case
On May 10, 2021, a Facebook user in Egypt with more than 15,000 followers shared a post by the verified Al Jazeera Arabic page consisting of text in Arabic and a photo.
The photo portrays two men in camouflage fatigues with faces covered, wearing headbands with the insignia of the Al-Qassam Brigades. The text states "The resistance leadership in the common room gives the occupation a respite until 18:00 to withdraw its soldiers from Al-Aqsa Mosque and Sheikh Jarrah neighborhood otherwise he who warns is excused. Abu Ubaida – Al-Qassam Brigades military spokesman." The user shared Al Jazeera’s post and added a single-word caption “Ooh” in Arabic. The Al-Qassam Brigades and their spokesperson Abu Ubaida are both designated as dangerous under Facebook’s Dangerous Individuals and Organizations Community Standard.
Facebook removed the content for violating this policy, and the user appealed the case to the Board. As a result of the Board selecting this case, Facebook concluded it had removed the content in error and restored it.
Key findings
After the Board selected this case, Facebook found that the content did not violate its rules on Dangerous Individuals and Organizations, as it did not contain praise, support or representation of the Al-Qassam Brigades or Hamas. Facebook was unable to explain why two human reviewers originally judged the content to violate this policy, noting that moderators are not required to record their reasoning for individual content decisions.
The Board notes that the content consists of the republication of a news item from a legitimate news outlet on a matter of urgent public concern. The original Al Jazeera post it shared was never removed and the Al-Qassam Brigades’ threat of violence was widely reported elsewhere. In general, individuals have as much right to repost news stories as media organizations have to publish them in the first place.
The user in this case explained that their purpose was to update their followers on a matter of current importance, and their addition of the expression “Ooh” appears to be neutral. As such, the Board finds that removing the user’s content did not materially reduce offline harm.
Reacting to allegations that Facebook has censored Palestinian content due to Israeli government demands, the Board asked Facebook questions including whether the company had received official and unofficial requests from Israel to remove content related to the April-May conflict. Facebook responded that it had not received a valid legal request from a government authority related to the user’s content in this case, but declined to provide the remaining information requested by the Board.
Public comments submitted for this case included allegations that Facebook has disproportionately removed or demoted content from Palestinian users and content in Arabic, especially in comparison to its treatment of posts threatening anti-Arab or anti-Palestinian violence within Israel. At the same time, Facebook has been criticized for not doing enough to remove content that incites violence against Israeli civilians. The Board recommends an independent review of these important issues, as well as greater transparency regarding Facebook’s treatment of government requests.
The Oversight Board’s decision
The Oversight Board affirms Facebook’s decision to restore the content, noting that its original decision to remove the content was not warranted.
In a policy advisory statement, the Board recommends that Facebook:
- Add criteria and illustrative examples to its Dangerous Individuals and Organizations policy to increase understanding of the exceptions for neutral discussion, condemnation and news reporting.
- Ensure swift translation of updates to the Community Standards into all available languages.
- Engage an independent entity not associated with either side of the Israeli-Palestinian conflict to conduct a thorough examination to determine whether Facebook’s content moderation in Arabic and Hebrew, including its use of automation, has been applied without bias. This examination should review not only the treatment of Palestinian or pro-Palestinian content, but also content that incites violence against any potential targets, no matter their nationality, ethnicity, religion or belief, or political opinion. The review should look at content posted by Facebook users located in and outside of Israel and the Occupied Palestinian Territories. The report and its conclusions should be made public.
- Formalize a transparent process on how it receives and responds to all government requests for content removal, and ensure that they are included in transparency reporting. The transparency reporting should distinguish government requests that led to removals for violations of the Community Standards from requests that led to removal or geo-blocking for violating local law, in addition to requests that led to no action.
*Case summaries provide an overview of the case and do not have precedential value.
Full case decision
1. Decision summary
The Oversight Board agrees that Facebook was correct to reverse its original decision to remove content on Facebook that shared a news post about a threat of violence from the Izz al-Din al-Qassam Brigades, the military wing of the Palestinian group Hamas, made on May 10, 2021. The Al-Qassam Brigades are designated as a terrorist organization by many states, either as part of Hamas or on their own account. After the user appealed and the Board selected the case for review, Facebook concluded that the content was removed in error and restored the post to the platform.
The Dangerous Individuals and Organizations policy states that sharing the official communications of a dangerous organization designated by Facebook is a form of substantive support. The policy, however, includes news reporting and neutral discussion exceptions. The company applied the news reporting exception to Al Jazeera’s post and erroneously failed to apply the neutral discussion exception, which it later corrected. The Board concludes that removing the content in this case was not necessary, as it did not reduce offline harm and instead resulted in an unjustified restriction on freedom of expression on an issue of public interest.
2. Case description
On May 10, a Facebook user in Egypt (the user) with more than 15,000 followers shared a post by the verified Al Jazeera Arabic page consisting of text in Arabic and a photo. The photo portrays two men in camouflage fatigues with faces covered, wearing headbands with the insignia of the Al-Qassam Brigades, a Palestinian armed group and the militant wing of Hamas. The Board notes that the Al-Qassam Brigades have been accused of committing war crimes (Report of the UN Independent Commission of Inquiry on the 2014 Gaza Conflict, A/HRC/29/CRP.4, and Human Rights Watch, Gaza: Apparent War Crimes During May Fighting (2021)).
The text in the photo states: "The resistance leadership in the common room gives the occupation a respite until 18:00 to withdraw its soldiers from Al-Aqsa Mosque and Sheikh Jarrah neighborhood in Jerusalem, otherwise he who warns is excused. Abu Ubaida – Al-Qassam Brigades military spokesman." Al Jazeera’s caption read: "'He Who Warns is Excused'. Al-Qassam Brigades military spokesman threatens the occupation forces if they do not withdraw from Al-Aqsa Mosque." The user shared the Al Jazeera post and added a single-word caption “Ooh” in Arabic. The Al-Qassam Brigades and their spokesperson Abu Ubaida are both designated as dangerous under Facebook’s Dangerous Individuals and Organizations Community Standard.
On the same day, a different user in Egypt reported the post, selecting “terrorism” from the fixed list of reasons Facebook gives people who report content. The content was assessed by an Arabic-speaking moderator in North Africa, who removed the post for violating the Dangerous Individuals and Organizations policy. The user appealed and the content was reviewed by a different reviewer in Southeast Asia who did not speak Arabic but had access to an automated translation of the content. Facebook explained that this was due to a routing error that it is working on resolving. The second reviewer also found a breach of the Dangerous Individuals and Organizations policy and the user received a notification explaining that the initial decision was upheld by a second review. Due to the violation, the user received a three-day read-only restriction on their account. Facebook also restricted the user’s ability to broadcast livestreamed content and use advertising products on the platform for 30 days.
The user then appealed to the Oversight Board. As a consequence of the Board selecting the case for review, Facebook determined that the content was removed in error and restored it. Facebook later confirmed to the Board that the original Al Jazeera post remained on the platform and had never been taken down.
The content in this case relates to the May 2021 armed conflict between Israeli forces and Palestinian militant groups in Israel and Gaza, a Palestinian territory governed by Hamas. The conflict broke out after weeks of rising tensions and protests in Jerusalem tied to a dispute over ownership of homes in the Sheikh Jarrah neighborhood of East Jerusalem and to an Israeli Supreme Court ruling concerning the planned expulsion of four Palestinian families from the disputed properties. These tensions had escalated into a series of sectarian assaults by both Arab and Jewish mobs. On May 10, Israeli forces raided the Al-Aqsa Mosque, injuring hundreds of worshippers during Ramadan prayers (Communication from UN Independent Experts to the Government of Israel, UA ISR 3.2021). After this raid the Al-Qassam Brigades issued an ultimatum, demanding that Israeli soldiers withdraw from both the Mosque and Sheikh Jarrah by 6pm. After the deadline expired, Al-Qassam and other Palestinian militant groups in Gaza launched rockets at the civilian center of Jerusalem, setting off 11 days of armed conflict.
3. Authority and scope
The Board has the power to review Facebook's decision following an appeal from the user whose post was removed (Charter Article 2, Section 1; Bylaws Article 2, Section 2.1). The Board may uphold or reverse that decision (Charter Article 3, Section 5). In line with case decision 2020-004-IG-UA, Facebook reversing a decision that a user appealed to the Board does not exclude the case from review.
The Board's decisions are binding and may include policy advisory statements with recommendations. These recommendations are non-binding, but Facebook must respond to them (Charter Article 3, Section 4).
4. Relevant standards
The Oversight Board considered the following standards in its decision:
I. Facebook’s Community Standards:
The Community Standard on Dangerous Individuals and Organizations states that Facebook does “not allow organizations or individuals that proclaim a violent mission or are engaged in violence to have a presence on Facebook.” Facebook carries out its own process of designating entities as dangerous under this policy, with its designations often based on national terrorist lists.
On June 22, Facebook updated the policy to divide these designations into three tiers. The update explains that the three tiers “indicate the level of content enforcement, with Tier 1 resulting in the most extensive enforcement because we believe these entities have the most direct ties to offline harm.” Tier 1 designations are focused on “entities that engage in serious offline harms” such as terrorist groups and result in the highest level of content enforcement. Facebook removes praise, substantive support, and representation of Tier 1 entities as well as their leaders, founders, or prominent members.
II. Facebook values:
The value of "Voice" is described as "paramount":
The goal of our Community Standards has always been to create a place for expression and give people a voice. […] We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.
Facebook limits "Voice" in the service of four values. “Safety” is the most relevant in this case:
We are committed to making Facebook a safe place. Expression that threatens people has the potential to intimidate, exclude or silence others and isn't allowed on Facebook.
III. Human Rights Standards:
The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. In March 2021, Facebook announced its Corporate Human Rights Policy, where it recommitted to respecting human rights in accordance with the UNGPs. The Board's analysis in this case was informed by the following human rights standards:
- The right to freedom of opinion and expression: Article 19, International Covenant on Civil and Political Rights (ICCPR); Human Rights Committee, General Comment No. 34 (2011); UN Special Rapporteur on freedom of opinion and expression, A/74/486 (2019); Human Rights Council, Resolution on the Safety of Journalists, A/HRC/RES/45/18 (2020); UN Special Rapporteur on the situation of human rights in the Palestinian territories occupied since 1967, A/75/532 (2020);
- The right to non-discrimination: ICCPR Articles 2 and 26;
- The right to life: ICCPR Article 6; Human Rights Committee, General Comment No. 36 (2018);
- The right to security of person: ICCPR Article 9, as interpreted by Human Rights Committee, General Comment No. 35, para. 9 (2014).
5. User statement
In their appeal to the Board, the user explained that they shared the Al Jazeera post to update their followers on the developing crisis and that it was an important issue that more people should be aware of. The user stressed that their post merely shared content from an Al Jazeera page and that their caption was simply “ooh.”
6. Explanation of Facebook’s decision
In response to the Board’s inquiry, Facebook stated that it was unable to explain why the two human reviewers judged the content to violate the Dangerous Individuals and Organizations policy, noting that moderators are not required to record their reasoning for individual content decisions. The company clarified that “in this case, the content reviewers had access to the entire piece of content, which includes the caption and image of the original root post and the additional caption the content creator placed on the shared version of the post.” The company added that “generally, content reviewers are trained to look at the entire piece of content.”
As a consequence of the Board selecting this case for review, Facebook reexamined its decision and found that the content did not contain praise, substantive support, or representation of the Al-Qassam Brigades or Hamas, their activities, or their members. Facebook explained that it reversed its decision since Al Jazeera’s post was non-violating and the user shared it using a neutral caption. Under the Dangerous Individuals and Organizations policy, channeling information or resources, including official communications, on behalf of a designated entity or event is a prohibited form of substantive support. However, the policy specifically provides an exception for content published as part of news reporting, though it does not define what constitutes news reporting. The policy also provides a neutral discussion exception. The original Al Jazeera post appeared, and still appears, on the Al Jazeera Arabic Facebook page. It was never removed by Facebook.
Facebook explained that Al Jazeera’s page is subject to the cross-check system, an additional layer of review which Facebook applies to some high-profile accounts to minimize the risk of errors in enforcement. However, cross-checking is not performed on content that is shared by a third party, unless that third party is also a high-profile account subject to cross-check. Thus, in this case, although the root post by Al Jazeera was subject to cross-checking, the post by the user in Egypt was not.
The company stated that its restoration of the post is consistent with its responsibility to respect the right to seek, receive, and impart information. Facebook concluded that the user’s caption was neutral and did not fit within the definitions of praise, substantive support, or representation.
Reacting to allegations that Facebook has censored Palestinian content due to the Israeli government’s demands, the Board asked Facebook:
Has Facebook received official and unofficial requests from Israel to take down content related to the April-May conflict? How many requests has Facebook received? How many has it complied with? Did any requests concern information posted by Al Jazeera Arabic or its journalists?
Facebook responded by saying: "Facebook has not received a valid legal request from a government authority related to the content the user posted in this case. Facebook declines to provide the remaining requested information. See Oversight Board Bylaws, Section 2.2.2."
Under the Oversight Board Bylaws, Section 2.2.2, Facebook may “decline such requests where Facebook determines that the information is not reasonably required for decision-making in accordance with the intent of the charter, is not technically feasible to provide, is covered by attorney/client privilege, and/or cannot or should not be provided because of legal, privacy, safety, or data protection restrictions or concerns.” The company did not indicate the specific reasons for the refusal under the Bylaws.
7. Third-party submissions
The Oversight Board received 26 public comments related to this case. Fifteen were from the United States and Canada, seven from Europe, three from the Middle East and North Africa, and one from Latin America and the Caribbean.
The submissions covered themes including the importance of social media to Palestinians, concerns about Facebook’s potential bias against and over-moderation of Palestinian or pro-Palestinian content, concerns about the alleged opaque relationship between Israel and Facebook, and concerns that messages from designated terrorist organizations were allowed on the platform.
Additionally, the Board received several public comments arguing that the reporting of such threats may also warn of attacks by armed groups, thus allowing those targeted to take measures to protect themselves.
8. Oversight Board analysis
8.1 Compliance with Community Standards
The Board concludes that Facebook’s original decision to remove the content was not in line with the company’s Community Standards. Facebook’s reversal of that decision after the Board selected this case was therefore correct.
According to Facebook, Al Jazeera’s root post, which is not the subject of this appeal, did not violate the Community Standards and was never removed from the platform. While sharing official communications from a designated entity is a prohibited form of substantive support, the Dangerous Individuals and Organizations policy allows such content to be posted for condemnation, neutral discussion, or news reporting purposes. Facebook updated the Dangerous Individuals and Organizations policy on June 22, 2021, making public previously confidential definitions of “substantive support,” “praise,” and “representation.”
The user shared Al Jazeera’s post with a single-word caption “ooh.” Facebook concluded that the term was a neutral form of expression. Arabic language experts consulted by the Board explained that the meaning of “ooh” varies depending on its usage, with neutral exclamation being one interpretation.
In the updated Dangerous Individuals and Organizations policy, Facebook requires “people to clearly indicate their intent” to neutrally discuss dangerous individuals or organizations, and “if the intention is unclear, we may remove content” (emphasis added). This is a change from the previous policy, as indicated in Facebook’s responses to the Board’s questions in case decision 2020-005-FB-UA. There, Facebook explained that it “treated content that quotes, or attributes quotes (regardless of their accuracy), to a designated dangerous individual as an expression of support for that individual unless the user provides additional context to make their intent explicit” (emphasis added). It follows that, prior to the June 22 update of the public-facing Community Standards, content moderators had less discretion to retain content with unclear intent, while users were unaware of the importance of making their intent clear.
It is understandable why content moderators, operating under time pressure, might treat the post as violating, especially under the version of the Community Standard in effect at the time. The post conveys a direct threat from a spokesman for a designated dangerous organization, and the user’s addition of the expression “ooh” did not make “explicit” the user’s intent to engage in neutral discussion. However, the salient fact is that this was a republication of a news item from a legitimate news outlet on a matter of urgent public concern. The root post, from Al Jazeera, has never been found to be violating and has remained on its page throughout. In general, individuals have no less right to repost news than news media organizations have to publish it in the first place. Although in some contexts the republication of material from a news source might be violating, in this case the user has explained that their purpose was to update their followers on a matter of current importance, and Facebook’s conclusion (on reexamination) that the user’s addition of the expression “ooh” was most likely neutral is confirmed by the Board’s language experts.
Under the new version of the relevant Community Standard, announced on June 22, the post was not clearly violating, and Facebook did not err in restoring it.
8.2 Compliance with Facebook’s values
The Board concludes that the decision to restore this content complies with Facebook’s value of “Voice,” and is not inconsistent with its value of “Safety.” The Board is aware that Facebook’s values play a role in the company’s development of policies and are not used by moderators to decide whether content is permissible.
Facebook states that the value of “Voice” is “paramount.” In the Board's view, this is especially true in the context of a conflict where the ability of many people, including Palestinians and their supporters, to express themselves is highly restricted. As numerous public comments submitted to the Board stress, Facebook and other social media are the primary means that Palestinians have to communicate news and opinion, and to express themselves freely. There are severe limitations on the freedom of expression in territories governed by the Palestinian Authority and Hamas (A/75/532, para. 25). Additionally, the Israeli government has been accused of unduly restricting expression in the name of national security (Working Group on the Universal Periodic Review, A/HRC/WG.6/29/ISR/2, para. 36-37; Oxford Handbook on the Israeli Constitution, Freedom of Expression in Israel: Origins, Evolution, Revolution and Regression, (2021)). Furthermore, for people in the region more broadly, the ability to receive and share news about these events is a crucial aspect of “Voice.”
The Board only selects a limited number of appeals to review, but notes that this case was among several appeals it received concerning content relating to the conflict.
On the other hand, the value of “Safety” is also a vital concern in Israel, the Occupied Palestinian Territories, and other countries in the region. The user shared a post from a media organization that contained an explicit threat of violence from the Al-Qassam Brigades, implicating the value of “Safety.” However, the content that the user shared was broadly available around the world, on and off Facebook. The root post, a news media report of the threat, was not removed from Facebook and still remains on Al Jazeera’s page. It was also widely reported elsewhere. The Board finds that sharing the post did not pose any additional threat to the value of “Safety.”
8.3 Compliance with Facebook’s human rights responsibilities
Freedom of expression (Article 19 ICCPR)
Article 19 of the ICCPR states that everyone has the right to freedom of expression, which includes freedom to seek, receive and impart information. The enjoyment of this right is intrinsically tied to access to free, uncensored and unhindered press or other media (General Comment 34, para. 13). The Board agrees that the media “plays a crucial role in informing the public about acts of terrorism and its capacity to operate should not be unduly restricted” (General Comment 34, para. 46). The Board is also aware that terrorist groups may exploit the media’s duty and interest to report on their activities.
However, counter-terrorism and counter-extremism efforts should not be used to repress media freedom (A/HRC/RES/45/18). Indeed, the media has an essential role to play during the first moments of a terrorist act, as it is “often the first source of information for citizens, well before the public authorities are able to take up the communication” (UNESCO, Handbook on Terrorism and the Media, p. 27, 2017). Social media contributes to this mission by supporting the dissemination of information about threats or acts of terrorism published in traditional media and non-media sources.
While the right to freedom of expression is fundamental, it is not absolute. It may be restricted, but restrictions should meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR).
I. Legality (clarity and accessibility of the rules)
To meet the test of legality, a rule must be (a) formulated with sufficient precision so that individuals can regulate their conduct accordingly, and (b) made accessible to the public. A lack of specificity can lead to subjective interpretation of rules and their arbitrary enforcement (General Comment No. 34, para. 25).
The Board has criticized the vagueness of the Dangerous Individuals and Organizations Community Standard in several cases and called on the company to define praise, support and representation. Facebook has since revised the policy, releasing an update on June 22. It defined or gave examples of some key terms in the policy, organized its rules around three tiers of enforcement according to the connection between a designated entity and offline harm, and further stressed the importance of users making their intent clear when posting content related to dangerous individuals or organizations. The policy, however, remains unclear on how users can make their intentions clear and does not provide examples of the ‘news reporting,’ ‘neutral discussion,’ and ‘condemnation’ exceptions.
Moreover, the updated policy seemingly increases Facebook’s discretion in cases where the user’s intent is unclear, now providing that Facebook “may” remove the content without offering any guidance to users about the criteria that will inform the use of that discretion. The Board believes that criteria for assessing these exceptions, including illustrative examples, would help users understand what posts are permissible. Additional examples will also give clearer guidance to reviewers.
In addition, the Board is concerned that this revision to the Community Standards was not translated into languages other than US English for close to two months, limiting access to the rules for users outside the US English market. Facebook explained that it applies changes to policies globally, even when translations are delayed. The Board is concerned that these translation delays leave the rules inaccessible to too many users for too long. This is not acceptable for a company with Facebook's resources.
II. Legitimate aim
Restrictions on freedom of expression must pursue a legitimate aim; these aims include the protection of national security, public order, and the rights of others. The Dangerous Individuals and Organizations policy seeks to prevent and disrupt real-world harm with the legitimate aim of protecting the rights of others, which in this case include the right to life and the security of persons.
III. Necessity and proportionality
Restrictions must be necessary and proportionate to achieve their legitimate aim, in this case protecting the rights of others. Any restrictions on freedom of expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected” (General Comment 34, para. 34). In resolving questions of necessity and proportionality, context plays a key role.
The Board further stresses that Facebook has a responsibility to identify, prevent, mitigate and account for adverse human rights impacts (UNGPs, Principle 17). This due diligence responsibility is heightened in conflict-affected regions (A/75/212, para. 13). The Board notes that Facebook has taken some steps to ensure that content is not removed unnecessarily and disproportionately, as illustrated by the news reporting exception and by the commitment to allow discussion of human rights concerns described in case decision 2021-006-IG-UA.
The Board concludes that removal of the content in this case was not necessary. The Board recognizes that journalists face a challenge in balancing the potential harm of reporting on the statements of a terrorist organization against keeping the public informed about evolving and dangerous situations. Some Board members expressed concern that the reporting in this instance provided little or no editorial context for Al-Qassam’s statements, and thus could be seen as a conduit for Al-Qassam’s threat of violence. However, the content posted by Al Jazeera was widely reported by other outlets and widely available globally, accompanied by further context as developments unfolded. The Board thus concludes that removal of this user’s republication of the Al Jazeera report did not materially reduce the terroristic impact the group presumably intended, but instead affected the ability of this user, in a nearby country, to communicate the importance of these events to their readers and followers.
As already noted in connection with the value of “Voice,” in reviewing the necessity of the removal, the Board considers significant the broader media and information environment in this region. The Israeli government, the Palestinian Authority, and Hamas unduly restrict free speech, which negatively impacts Palestinian and other voices.
Restrictions on freedom of expression must be non-discriminatory, including on the basis of nationality, ethnicity, religion or belief, or political or other opinion (Article 2, para. 1, and Article 26, ICCPR). Discriminatory enforcement of the Community Standards violates this fundamental aspect of freedom of expression. The Board has received public comments and reviewed publicly available information alleging that Facebook has disproportionately removed or demoted content from Palestinian users and content in the Arabic language, especially in comparison to its treatment of posts threatening or inciting anti-Arab or anti-Palestinian violence within Israel. At the same time, Facebook has been criticized for not doing enough to remove content that incites violence against Israeli civilians. Below, the Board recommends an independent review of these important issues.
9. Oversight Board decision
The Oversight Board affirms Facebook's decision to restore the content, agreeing that the original decision to take down the post was in error.
10. Policy advisory statement
Content Policy
To clarify its rules to users, Facebook should:
1. Add criteria and illustrative examples to its Dangerous Individuals and Organizations policy to increase understanding of the exceptions for neutral discussion, condemnation and news reporting.
2. Ensure swift translation of updates to the Community Standards into all available languages.
Transparency
To address public concerns regarding potential bias in content moderation, including in respect of actual or perceived government involvement, Facebook should:
3. Engage an independent entity not associated with either side of the Israeli-Palestinian conflict to conduct a thorough examination to determine whether Facebook’s content moderation in Arabic and Hebrew, including its use of automation, has been applied without bias. This examination should review not only the treatment of Palestinian or pro-Palestinian content, but also content that incites violence against any potential targets, no matter their nationality, ethnicity, religion or belief, or political opinion. The review should look at content posted by Facebook users located in and outside of Israel and the Occupied Palestinian Territories. The report and its conclusions should be made public.
4. Formalize a transparent process on how it receives and responds to all government requests for content removal, and ensure that they are included in transparency reporting. The transparency reporting should distinguish government requests that led to removals for violations of the Community Standards from requests that led to removal or geo-blocking for violating local law, in addition to requests that led to no action.
*Procedural note:
The Oversight Board’s decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.
For this case decision, independent research was commissioned on behalf of the Board. An independent research institute headquartered at the University of Gothenburg and drawing on a team of over 50 social scientists on six continents, as well as more than 3,200 country experts from around the world, provided expertise on socio-political and cultural context. The company Lionbridge Technologies, LLC, whose specialists are fluent in more than 350 languages and work from 5,000 cities across the world, provided linguistic expertise.