Multiple Case Decision
Posts That Include “From the River to the Sea”
September 4, 2024
In reviewing three cases involving different pieces of Facebook content containing the phrase “From the River to the Sea,” the Board finds they did not break Meta’s rules on Hate Speech, Violence and Incitement, or Dangerous Organizations and Individuals.
3 cases included in this bundle
FB-TDOKI4L8
Case on Facebook
FB-0H634H19
Case on Facebook
FB-OMEHM1ZR
Case about violent and graphic content on Facebook
This decision is available in Arabic and Hebrew.
Summary
In reviewing three cases involving different pieces of Facebook content containing the phrase “From the River to the Sea,” the Board finds they did not break Meta’s rules on Hate Speech, Violence and Incitement, or Dangerous Organizations and Individuals. Specifically, the three pieces of content contain contextual signs of solidarity with Palestinians – but no language calling for violence or exclusion. They also do not glorify or even refer to Hamas, an organization designated as dangerous by Meta. In upholding Meta’s decisions to keep up the content, the majority of the Board notes the phrase has multiple meanings and is used by people in various ways and with different intentions. A minority, however, believes that because the phrase appears in the 2017 Hamas charter and given the October 7 attacks, its use in a post should be presumed to constitute glorification of a designated entity, unless there are clear signals to the contrary.
These three cases highlight tensions between Meta’s value of voice and the need to protect freedom of expression, particularly political speech during conflict, and Meta’s values of safety and dignity, which protect people against intimidation, exclusion and violence. The ongoing conflict that followed the Hamas terrorist attack in October 2023 and Israel’s subsequent military operations has led to protests globally and accusations that both sides have violated international law. The surge in antisemitism and Islamophobia is relevant not only to these cases but also to the general use of “From the River to the Sea” on Meta’s platforms. These cases have again underscored the importance of data access for effectively assessing Meta’s content moderation during conflicts, as well as the need for a method to track the amount of content attacking people based on a protected characteristic. The Board’s recommendations urge Meta to ensure its new Content Library is an effective replacement for CrowdTangle and to fully implement a recommendation from the BSR Human Rights Due Diligence Report of Meta’s Impacts in Israel and Palestine.
About the Cases
In the first case, a Facebook user commented on a video posted by a different user. The video’s caption encourages others to “speak up” and includes hashtags such as “#ceasefire” and “#freepalestine.” The user’s comment includes the phrase “FromTheRiverToTheSea” in hashtag form, additional hashtags such as “#DefundIsrael” and heart emojis in the colors of the Palestinian flag. Viewed about 3,000 times, the comment was reported by four users but these reports were automatically closed because Meta’s automated systems did not prioritize them for human review.
The Facebook user in the second case posted what is likely to be a generated image of floating watermelon slices that form the words from the phrase, alongside “Palestine will be free.” Viewed about 8 million times, this post was reported by 937 users. Some of these reports were assessed by human moderators who found the post did not break Meta’s rules.
For the third case, an administrator of a Facebook page reshared a post by a Canadian community organization, in which the founding members declared support for the Palestinian people and condemned their “senseless slaughter” by “Zionist Israeli occupiers.” With fewer than 1,000 views, this post was reported by one user but the report was automatically closed.
In all three cases, users then appealed to Meta to remove the content but the appeals were closed without human review following an assessment by one of the company’s automated tools. After Meta upheld its decisions to keep the content on Facebook, the users appealed to the Board.
Unprecedented terrorist attacks by Hamas on Israel in October 2023, which killed 1,200 people and involved 240 hostages being taken, have been followed by a large-scale military response by Israel in Gaza, killing over 39,000 people (as of July 2024). Both sides have since been accused of violating international law, and committing war crimes and crimes against humanity. This has generated worldwide debate, much of which has taken place on social media, including Facebook, Instagram and Threads.
Key Findings
The Board finds there is no indication that the comment or the two posts broke Meta’s Hate Speech rules because they do not attack Jewish or Israeli people with calls for violence or exclusion, nor do they attack a concept or institution associated with a protected characteristic that could lead to imminent violence. Instead, the three pieces of content contain contextual signals of solidarity with Palestinians, in the hashtags, visual representation or statements of support. On other policies, they do not break the Violence and Incitement rules nor do they violate Meta’s Dangerous Organizations and Individuals policy as they do not contain threats of violence or other physical harm, nor do they glorify Hamas or its actions.
In coming to its decision, the majority of the Board notes that the phrase “From the River to the Sea” has multiple meanings. While it can be understood by some as encouraging and legitimizing antisemitism and the violent elimination of Israel and its people, it is also often used as a political call for solidarity, equal rights and self-determination of the Palestinian people, and to end the war in Gaza. Given this fact, and as these cases show, the standalone phrase cannot be understood as a call to violence against a group based on their protected characteristics, as advocacy for the exclusion of a particular group, or as support for a designated entity – Hamas. The phrase’s use by this terrorist group, with its explicit violent eliminationist intent and actions, does not make the phrase inherently hateful or violent, considering the variety of people using the phrase in different ways. It is vital to assess factors such as context and the identification of specific risks when analyzing content posted on Meta’s platforms as a whole. Though removing content could have aligned with Meta’s human rights responsibilities if the phrase had been accompanied by statements or signals calling for exclusion or violence, or legitimizing hate, such removal would have been based not on the phrase itself but on other violating elements, in the view of the majority of the Board. Because the phrase does not have a single meaning, a blanket ban on content that includes the phrase, a default rule towards removal of such content, or even using it as a signal to trigger enforcement or review, would hinder protected political speech in unacceptable ways.
In contrast, a minority of the Board finds that Meta should adopt a default rule presuming the phrase constitutes glorification of a designated organization, unless there are clear signals the user does not endorse Hamas or the October 7 attacks.
One piece of research commissioned by the Board for these cases relied on the CrowdTangle data analysis tool. Access to platform data is essential for the Board and other external stakeholders to assess the necessity and proportionality of Meta’s content moderation decisions during armed conflicts. This is why the Board is concerned by Meta’s decision to shut down the tool while questions remain over whether the newer Meta Content Library is an adequate replacement.
Finally, the Board recognizes that even with research tools, there is limited ability to effectively assess the extent of the surge in antisemitic, Islamophobic, and racist and hateful content on Meta’s platforms. The Board urges Meta to fully implement a recommendation previously issued by the BSR Human Rights Due Diligence report to address this.
The Oversight Board’s Decision
The Oversight Board upholds Meta’s decisions to leave up the content in all three cases.
The Board recommends that Meta:
- Ensure that qualified researchers, civil society organizations and journalists, who previously had access to CrowdTangle, are onboarded to the new Meta Content Library within three weeks of submitting their application.
- Ensure its Content Library is a suitable replacement for CrowdTangle, providing equal or greater functionality and data access.
- Implement recommendation no. 16 from the BSR Human Rights Due Diligence of Meta’s Impacts in Israel and Palestine report to develop a mechanism to track the prevalence of content attacking people based on specific protected characteristics (for example, antisemitic, Islamophobic and homophobic content).
* Case summaries provide an overview of cases and do not have precedential value.
Full Case Decision
1. Case Description and Background
The Oversight Board reviewed three cases together involving content posted on Facebook by different users in November 2023, following the Hamas terrorist attacks of October 7 and after Israel had started a military campaign in Gaza in response. The three pieces of content, all in English, each contain the phrase “From the River to the Sea.”
In the first case, a Facebook user commented on another user’s video. The video has a caption encouraging others to “speak up” and several hashtags including “#ceasefire” and “#freepalestine.” The comment contains the phrase “FromTheRiverToTheSea” in hashtag form, as well as additional hashtags including “#DefundIsrael” and heart emojis in the colors of the Palestinian flag. The user who created the content is not a public figure and they have fewer than 500 friends and no followers. The comment had about 3,000 views and was reported seven times by four users. The reports were closed after Meta’s automated systems did not prioritize them for human review within 48 hours. One of the users who reported the content then appealed to Meta.
In the second case, a Facebook user posted what appears to be a generated image of floating watermelon slices that form the words “From the River to the Sea,” along with “Palestine will be free.” The user who created the content is not a public figure and they have fewer than 500 friends and no followers. The post had about 8 million views and was reported 951 times by 937 users. The first report was closed, again because Meta’s automated systems did not prioritize it for human review within 48 hours. Some of the other reports were reviewed and assessed by human moderators who decided the content was non-violating. Several users who reported the content then appealed to Meta.
In the third case, the administrator of a Facebook page reshared a post from the page of a community organization in Canada. The post is a statement from the organization’s “founding members” who declare support for “the Palestinian people,” condemn their “senseless slaughter” by the “Zionist State of Israel” and “Zionist Israeli occupiers,” and express their solidarity with “Palestinian Muslims, Palestinian Christians and anti-Zionist Palestinian Jews.” The post ends with the phrase “From The River To The Sea.” This post had fewer than 1,000 views and was reported by one user. The report was automatically closed. The user who reported the content then appealed to Meta.
All the appeals Meta received regarding the three pieces of content were closed without human review, based on an assessment by automated tools. Meta upheld its decisions to keep the three pieces of content on the platform. The users who reported the content then appealed to the Board to have the content taken down. After the Board selected and announced these cases, the user who posted the content in the third case deleted the post from Facebook.
The Board notes the following context in reaching its decision.
On October 7, 2023, Hamas, a designated Tier 1 organization under Meta’s Dangerous Organizations and Individuals Community Standard, led unprecedented terrorist attacks on Israel from Gaza that killed an estimated 1,200 people and resulted in roughly 240 people being taken hostage, mostly Jewish and several Muslim Israeli citizens, as well as dual citizens and foreign nationals (Ministry of Foreign Affairs, Government of Israel). More than 115 of those hostages continued to be held in captivity as of July 2024. The attacks included the burning and destruction of hundreds of homes and led to the immediate and ongoing displacement of about 120,000 people. Israel immediately undertook a large-scale military campaign in Gaza in response to the attacks. Israel’s military action, which is ongoing, has killed over 39,000 people (the UN Office for the Coordination of Humanitarian Affairs, drawing on data from the Ministry of Health in Gaza). Reports indicate that, as of July 2024, 52% of the fatalities are estimated to be women and children. The military campaign has caused extensive destruction of civilian infrastructure and the repeated displacement of 1.9 million people, the overwhelming majority of Gaza’s population, who are now facing an acute humanitarian crisis. As of April 2024, at least 224 humanitarian personnel had been killed in Gaza, “more than three times as many humanitarian aid workers killed in any single conflict recorded in a single year.”
Meta immediately designated the events of October 7 a terrorist attack under its Dangerous Organizations and Individuals policy. Under its Community Standards, this means that Meta would remove any content on its platforms that “glorifies, supports or represents” the October 7 attacks or their perpetrators.
During the ongoing conflict, both sides have been accused of violating international law. Israel is facing proceedings for alleged violations of its obligations under the Convention on the Prevention and Punishment of the Crime of Genocide at the International Court of Justice. Moreover, Hamas and Israeli officials have each been named by the prosecutor of the International Criminal Court in applications for arrest warrants based on charges of war crimes and crimes against humanity alleged to have been committed by each party. Hamas officials are accused of bearing criminal responsibility for extermination; murder; taking hostages; rape and other acts of sexual violence; torture; other inhumane acts; cruel treatment; and outrages upon personal dignity in the context of captivity, on Israel and the Palestinian Territories (in the Gaza strip) from at least 7 October 2023. According to the prosecutor, these “were part of a widespread and systematic attack against the civilian population of Israel by Hamas and other armed groups pursuant to organisational policies,” some of which “continue to this day.” Israeli officials are accused of bearing criminal responsibility for starvation of civilians as a method of warfare; willfully causing great suffering, or serious injury to body or health; willful killing or murder; intentionally directing attacks against a civilian population; extermination and/or murder, including in the context of deaths caused by starvation; persecution; and other inhumane acts, on the Palestinian Territories (in the Gaza strip) from at least 8 October 2023. 
According to the prosecutor, these “were committed as part of a widespread and systematic attack against the Palestinian civilian population pursuant to State policy,” which “continue to this day.” Furthermore, in a July 19, 2024 Advisory Opinion, issued in response to a request by the UN General Assembly, the International Court of Justice concluded that “the State of Israel’s continued presence in the Occupied Palestinian Territory is unlawful” and stated the obligations for Israel, other States and international organizations, including the United Nations, on the basis of this finding. The Court’s analysis does not consider “conduct by Israel in the Gaza Strip in response to [the] attack carried out on 7 October 2023.”
The UN Independent International Commission of Inquiry on the Occupied Palestinian Territory, including East Jerusalem, and in Israel, established by the UN Human Rights Council, concluded, in a May 2024 report, that members of Hamas “deliberately killed, injured, mistreated, took hostages and committed sexual and gender-based violence against civilians, including Israeli citizens and foreign nationals, as well as members of the Israeli Security Forces (ISF).” According to the Commission, “these actions constitute war crimes,” as well as “violations and abuses of international humanitarian law and international human rights law.” The Commission also concluded that “Israel has committed war crimes, crimes against humanity and violations of [international humanitarian law] and [international human rights law].” It further stated that “Israel has used starvation as a method of war,” “weaponized the withholding of life-sustaining necessities, including humanitarian assistance,” and “perpetrated sexual and gender-based crimes against Palestinians.” With regards to reports and ISF allegations indicating that the military wing of Hamas and other non-State armed groups in Gaza operated from within civilian areas, the Commission “reiterates that all parties to the conflict, including ISF and the military wings of Hamas and other non-State armed groups, must adhere to [international humanitarian law] and avoid increasing risk to civilians by using civilian objects for military purposes.” Additionally, the UN Special Representative of the Secretary General on Sexual Violence in Conflict concluded that clear and convincing information was found that “sexual violence, including rape [and] sexualized torture” was committed against the hostages in the context of the October 7 attacks, and called for “a fully-fledged investigation.”
The terrorist attacks and military operations that have led to the death of tens of thousands of people and the dislocation of over two million people, mostly in Gaza, but also in Israel and the occupied West Bank, have generated intense worldwide interest, debate and scrutiny. Much of this has taken place on social media platforms, including Facebook, Instagram and Threads.
According to reporting and research commissioned by the Board, the use of the phrase “From the River to the Sea” surged across social media and in pro-Palestinian protests and demonstrations following the October 7 attacks and Israel’s military operations. The phrase refers to the area between the Jordan River and the Mediterranean Sea, which today covers the entirety of the State of Israel and the Israeli-occupied Palestinian Territories. The phrase predates the October 7 attacks and has a long history as part of the Palestinian protest movement, dating back to the partition plan adopted by the UN General Assembly in 1947. The phrase is tied to Palestinians’ aspirations for self-determination and equal rights (see public comments: Access Now PC-29291; SMEX PC-29396; PC-29211; PC-28564; Jewish Voice for Peace PC-29437). However, in its more recent use, the phrase has also been linked to Hamas. The original 1988 Hamas charter called for the destruction of Israel and “seems to encourage the killing of Jews wherever they are found” (PC-28895). The 2017 Hamas charter adopted the phrase “From the River to the Sea,” which is used by individuals and groups calling for violent opposition to or the destruction of Israel, and the forced removal of Jewish people from Palestine, with variations such as “from the River to the Sea, Palestine will be Arab” (see public comments: ADL PC-29259; American Jewish Committee PC-29479; NGO Monitor PC-28905; PC-29526; Jewish Home Education Network PC-28694). Another variation of the phrase also appeared in the 1977 platform of Israel’s ruling Likud Party: “Between the Sea and the Jordan there will only be Israeli sovereignty.”
The phrase does not have a single meaning. It has been adopted by various groups and individuals and its significance depends on the speaker, the listener and the context. For some, it is an antisemitic charge denying Jewish people the right to life, self-determination and to stay in their own state, established in 1948, including through forced removal of Jewish people from Israel. As a rallying cry, enshrined in Hamas’s charter, it has been used by the head of the Hamas political bureau Ghazi Hamad, anti-Israel voices, and supporters of terrorist organizations that seek Israel’s destruction through violent means. It is also a call for a Palestinian state encompassing the entire territory, which would mean the dismantling of the Jewish state. When heard by members of the Jewish and pro-Israel community, it may evoke fear and be understood by them as a legitimation or defense of the unprecedented scale of killings, abductions, slaughter and atrocities committed during the October 7 attacks, when Jewish people witnessed an attempted enactment of the aim to annihilate them. The fact that the Jewish population accounts for about 0.2% of the world population (15.7 million people worldwide), half of whom are Israeli Jews (about 0.1% of the world population), enhances this sentiment and a sense of risk and intimidation felt by many Jewish people (see public comments: ADL PC-29259; CAMERA PC-29218; Campaign Against Antisemitism PC-29361; World Jewish Congress PC-29480; American Jewish Committee PC-29479). On the other hand, the estimated number of Palestinians worldwide at the end of 2023 was about 14.6 million people, half of whom live inside Israel or in territories under Israeli occupation. This is partly why many understand the phrase as a call for the equal rights and self-determination of the Palestinian people.
At times it is used to indicate support for one or more specific political aims: a single bi-national state on all the territory, a two-state solution for both groups, the right of return for Palestinian refugees, or an end to the Israeli military occupation of Palestinian territories seized in the 1967 war, among other aims. In other contexts, the phrase is a simple affirmation of a place, a people and a history without any concrete political objectives or tactics (see public comments: Access Now PC-29291; SMEX PC-29396; PC-29211; PC-28564; Hearing Palestine Initiative at the University of Toronto PC-28564). After the October 7 Hamas terrorist attacks and the Israeli military campaign in Gaza, it has also been used alongside calls for a ceasefire (see public comments: Jewish Voice for Peace PC-29437; Access Now PC-29291; also Article 19 briefing). For some Palestinians and the pro-Palestinian community, the use of the phrase in the 1977 Likud platform, together with recent statements by Benjamin Netanyahu, the party leader, and members of his administration opposing the creation of a Palestinian state, indicates opposition both to a two-state solution and to equal rights for Palestinians, as well as a call for the expulsion of Palestinians from Gaza and/or the West Bank (see public comments: Access Now PC-29291; Digital Rights Foundation PC-29256).
The Board commissioned external experts to analyze the phrase on Meta’s platforms. The experts’ analysis relied on CrowdTangle, a data analysis tool owned and operated by Meta. CrowdTangle tracks public content from the largest public pages, groups and accounts across all countries and languages, but does not include all content on Meta’s platforms or information about content that was removed by the company. Therefore, instances of the use of the phrase accompanied by violating content (e.g., a direct attack or calls for violence targeting Jewish people and/or Israelis on the basis of a protected characteristic or content supporting a terrorist organization) would be unlikely to be found, because they would probably have been taken down by Meta. In the six months before the October 7 attacks, experts noted more uses of the phrase in Arabic than in English, on Facebook (1,600 versus 1,400, respectively). In the six months that followed October 7, up to March 23, 2024, the use of the phrase in English rose significantly compared with Arabic (82,082 versus 2,880, respectively). According to those experts, the most significant increases in the use of the phrase on Facebook during this period occurred in January and March. On Instagram, the phrase in English has been used significantly more than in Arabic before and after October 7. A big increase was observed in November 2023, at the same time as the Israel Defense Forces’ (IDF) strike on Al-Shifa Hospital, and the growing humanitarian crisis in Gaza. Additionally, the uses of the phrase found by the experts on the platform came as part of posts that either sought to raise awareness about the impact of the war on Palestinians, called for a ceasefire and/or celebrated Palestinian rights to self-determination and equality. 
Though some hashtags became increasingly critical of the Israeli military, the experts identified no posts that explicitly called for the death of Jewish people or supported Hamas’s actions on October 7. The absence of such posts may be the result of such content being removed by Meta.
The phrase has been used as part of anti-war and pro-Palestinian protests across the world, including during the US college campus protests of April to May 2024. As of June 6, 2024, more than 3,000 people had been arrested or detained at demonstrations on campuses in the United States for alleged violations of rules governing campus assemblies. In the majority of such cases, the charges were subsequently dropped. In other countries, there are instances in which officials have sought to ban or cancel protests or to prosecute protesters, due to the use of the phrase (for example, in Vienna, Austria). The Czech city of Prague sought to prohibit a demonstration in November 2023 because of the intended use of the phrase but a municipal court overturned the decision, allowing the demonstration to go ahead. In the United Kingdom, the former Home Secretary encouraged police to interpret the use of the phrase as a violation of law, but the Metropolitan Police declined to adopt a blanket ban. In Germany, the Ministry of the Interior designated the phrase a slogan associated with Hamas. The administrative court in the city of Münster, North Rhine-Westphalia, held that the phrase alone could not be interpreted as incitement because it has multiple meanings. However, the Higher Administrative Court of another state in Germany determined that even though the phrase could have multiple meanings, the court could not set aside a prohibition on its use in an assembly through a preliminary decision, given the order issued by the Ministry of the Interior.
In the United States, Resolution 883, approved by 377 votes to 44 in the House of Representatives in April 2024, condemns the phrase as “an antisemitic call to arms with the goal of the eradication of the State of Israel, which is located between the Jordan River and the Mediterranean Sea.” The resolution also emphasizes that “Hamas, the Palestinian Islamic Jihad, Hezbollah, and other terrorist organizations and their sympathizers have used and continue to use this slogan as a rallying cry for action to destroy Israel and exterminate the Jewish people.”
Since October 7, the United Nations, government agencies and advocacy groups have warned about an increase in both antisemitism and Islamophobia. In the United States, for example, in the three months following October 7, the Anti-Defamation League (ADL) tracked a 361% increase in reported antisemitic incidents – physical assaults, vandalism, verbal or written harassment and rallies that included “antisemitic rhetoric, expressions of support for terrorism against the state of Israel and/or anti-Zionism.” Even excluding this final category of rallies, which the ADL added after October 7, the United States still saw a 176% increase in antisemitic incidents. According to the Council on American-Islamic Relations, during the same three-month period, reports of anti-Muslim and anti-Palestinian discrimination and hate (e.g., employment discrimination, hate crimes and incidents, and education discrimination, among other categories outlined in its report, p. 13-15) rose by about 180% in the United States. Comparative data released by the UK’s Metropolitan Police on antisemitic and Islamophobic hate crimes in October 2022 versus October 2023 showed an increase in both: antisemitic hate crimes rose from 39 to 547, and Islamophobic hate crimes from 75 to 214. Some Board Members also consider the fact that Jews are 0.5% and Muslims are 6.5% of the UK population, and that Jews are 0.2% and Muslims 25.8% of the world population, as important context in evaluating these numbers. Countries across Europe have warned of rising hate crimes, hate speech and threats to civil liberties targeting Jewish and Muslim communities. Murder and other forms of very severe violence targeting Palestinians, and attempted murder, rape and other forms of very severe violence targeting Jewish people, have been reported since October 7, 2023.
2. User Submissions
The Facebook users who reported the content and subsequently appealed to the Board claimed the phrase broke Meta’s rules on Hate Speech, Violence and Incitement, or Dangerous Organizations and Individuals. The user who reported the content in the first case stated that the phrase violates Meta’s policies prohibiting content that promotes violence or supports terrorism. The users who reported the content in the second and third cases stated that the phrase constitutes hate speech, is antisemitic, and is a call for genocide and the abolition of the state of Israel.
3. Meta’s Content Policies and Submissions
I. Meta’s Content Policies
Meta analyzed the phrase and the content in the three cases under three policies.
Hate Speech
According to the policy rationale, the Hate Speech Community Standard prohibits “direct attacks against people – rather than concepts or institutions – on the basis of ... protected characteristics: [including] race, ethnicity, national origin [and] religious affiliation.” The company defines attacks as “dehumanizing speech; statements of inferiority, expressions of contempt or disgust; cursing; and calls for exclusion or segregation.”
Tier 1 of the policy prohibits targeting of a person or a group of people on the basis of their protected characteristic using “statements in the form of calls for action or statements of intent to inflict, aspirational or conditional statements about, or statements advocating or supporting harm” with “calls for death without a perpetrator or method” and “calls for accidents or other physical harms caused either by no perpetrator or by a deity.”
Under Tier 2 of the policy, Meta prohibits targeting a person or a group of people on the basis of their protected characteristics with “exclusion or segregation in the form of calls for action, statement of intent, aspirational or conditional statements, or statements advocating or supporting” explicit, political, economic or social exclusion.
Finally, under the section marked “require additional information and/or context to enforce,” the company prohibits “content attacking concepts, institutions, ideas, practices, or beliefs associated with protected characteristics, which are likely to contribute to imminent physical harm, intimidation or discrimination against the people associated with that protected characteristic.”
Violence and Incitement
The Violence and Incitement policy prohibits threatening anyone with “violence that could lead to death (or other forms of high-severity violence).” Additional protections are provided to “persons or groups based on their protected characteristics ... from threats of low-severity violence.” Prior to December 6, 2023, the prohibition against calls for violence was contained in the Hate Speech policy. According to Meta, the decision to move this policy line to the Violence and Incitement policy was part of a reorganization of the Community Standards and did not affect the way in which this policy line is enforced.
Under the section marked “require additional information and/or context to enforce,” the company prohibits “coded statements where the method of violence is not clearly articulated, but the threat is veiled or implicit, as shown by the combination of both a threat signal and a contextual signal.”
Dangerous Organizations and Individuals
After the users who reported the content to Meta appealed to the Board, Meta updated the Dangerous Organizations and Individuals Community Standard (on December 29, 2023). Hamas is a designated entity under Tier 1 of the policy. Prior to the December 29, 2023 update, the policy prohibited “praise” of Tier 1 entities, defined as “speaking positively about” or “legitimizing the cause of a designated entity by making claims that their hateful, violent, or criminal conduct is legally, morally, or otherwise justified or acceptable” or “aligning oneself ideologically with a designated entity or event.” The current policy (as of July 2024) prohibits “glorification” of a Tier 1 entity, including “legitimizing or defending the violent or hateful acts of a designated entity by claiming that those acts have a moral, political, logical or other justification that makes them acceptable or reasonable.”
II. Meta’s Submissions
Meta explained that the standalone phrase “From the River to the Sea” does not violate the company’s Hate Speech, Violence and Incitement, or Dangerous Organizations and Individuals policies. Therefore, it only removes content that contains the phrase if another portion of the content independently violates its Community Standards.
Meta explained that the company has not undertaken a complete development process to collect the views of global stakeholders and experts regarding the phrase, but it did review use of the phrase after the October 7 attacks and Israel’s military response. The company stated it is aware the phrase has a long history. It explained that while some stakeholders view the phrase as antisemitic or a threat to the State of Israel, other stakeholders use the phrase in support of Palestinian people and believe that describing it as antisemitic is “either inaccurate or rooted in Islamophobia.” Because of these differing views, Meta “cannot conclude, without additional context, that the users in the content in question are using the phrase as a call to violence against a group based on their protected characteristics.” Nor could it conclude, “without additional context, that ... the speaker is advocating for the exclusion of a particular group.”
In assessing the phrase under the Dangerous Organizations and Individuals policy, the company determined that “the phrase is not linked exclusively to Hamas. While Hamas uses the phrase in its 2017 charter, the phrase also predates the group and has always been used by people who are not affiliated with Hamas and who do not support its terrorist ideology.” As for the content under review, Meta determined that “none of the three pieces of content in this case bundle suggests support for Hamas or glorifies the organization. Absent this additional context, Meta assesses that this content does not violate our Community Standards.”
In response to the Board’s questions about the research and analysis Meta had undertaken in reaching its conclusions, Meta said that its Policy team reviewed how the phrase was being used on its platforms and assessed it against the Community Standards. The company also conducted some analysis to determine whether to block hashtags containing the phrase. According to Meta, the company will remove a hashtag if it is inherently violating and block a hashtag when a high prevalence of content associated with a hashtag is violating. To make this assessment, Meta’s operations team reviewed content containing hashtags of the phrase and found that only a handful of pieces of content violated Meta’s policies and did so for reasons other than the phrase.
The Board asked Meta whether the company had received government requests to remove content with the phrase and what action the company took in response. Meta informed the Board that the company received a number of requests from government bodies in Germany to restrict access to content in the country under local law. In response, Meta restricted access to the content in Germany.
4. Public Comments
The Oversight Board received 2,412 public comments that met the terms for submission: 60% came from the United States and Canada, 17% from the Middle East and North Africa, 12% from Europe, 6% from Asia Pacific and Oceania, and 5% from other regions. To read public comments submitted with consent to publish, click here.
The submissions covered the following themes: the use of the phrase by Hamas and its meaning as an antisemitic call for violence or exclusion; the phrase as protected political speech during an ongoing humanitarian crisis; historical roots of the phrase and evolution of its use, including as a call for Palestinians’ rights to equality and self-determination; the need to assess the phrase contextually to determine its meaning and whether it can be associated with calls for violence; and concerns over the use of automation to moderate content related to the conflict and its negative impact on human rights defenders and journalists.
5. Oversight Board Analysis
These three cases highlight the tension between Meta’s value of protecting voice and the heightened need to protect freedom of expression, particularly political speech in times of conflict, and Meta’s values of safety and dignity to protect people against intimidation, exclusion, violence and real-world harm. This is especially important during violent conflict with an impact on people’s safety, not only in the war zone but worldwide. It is imperative Meta take effective action to ensure its platforms are not used to incite acts of violence. The company’s response to this threat must also be guided by respect for all human rights, including freedom of expression. This is particularly relevant to the current and ongoing conflict that followed Hamas’s terrorist attack in October 2023 and Israel’s subsequent military operations, resulting in political protests around the world and accusations against both sides of violating international law. The surge in antisemitism and Islamophobia is also relevant to the assessment of not only these cases but also general use of the phrase “From the River to the Sea” on Meta’s platforms, given its different meanings, usages and understandings.
The Board notes that, while Meta determined that “the slogan, standing alone, does not violate the Community Standards,” the company “has not conducted research on the prevalence and use of the phrase,” aside from the work that the company’s teams did to understand the use of the phrase in hashtags, as explained in Section 3. Meta did not provide data on content containing the phrase that was taken down due to other violations of its policies. Nonetheless, many public comments received by the Board highlight nuances in the use of this phrase. The Board believes that by giving researchers more access to platform data and investing additional resources in the development of internal research, Meta would enable a better understanding of correlations between online behavior and offline harm.
The Board analyzed Meta’s decisions in these cases against Meta’s content policies, values and human rights responsibilities. The Board also assessed the implications of these cases for Meta’s broader approach to content governance.
5.1 Compliance With Meta’s Content Policies
I. Content Rules
The Board finds that the three pieces of content (one comment and two posts) do not violate Meta’s policies. Because the phrase “From the River to the Sea” can have a wide variety of meanings and interpretations, the Board looked at the content as a whole in these three posts to determine whether a policy was violated.
There is no indication that any of the three pieces of content under review violates Meta’s Hate Speech policy by attacking Jewish or Israeli people with calls for violence or exclusion, or constitutes an attack on a concept or an institution associated with a protected characteristic that could lead to imminent violence. While the phrase has been used by some to attack Jewish or Israeli people, these three pieces of content express or contain contextual signals of solidarity with Palestinians, and there is no language or signal calling for violence or exclusion. The comment in the first case is on a video that encourages others to “speak up” and includes a “#ceasefire” hashtag, while the user’s comment contains “#PalestineWillBeFree” and “#DefundIsrael” hashtags, as well as heart emojis in the colors of the Palestinian flag. The post in the second case is a visual representation, seemingly a generated image of floating watermelon slices (watermelon is a symbol of Palestinian solidarity, with the same colors as the Palestinian flag) that form the words of the phrase along with “Palestine will be free,” with no additional caption or visual signals. The third post expressly states it is in solidarity with Palestinian families fighting to survive, stating support for Palestinians of all faiths.
None of the three pieces of content violates the Violence and Incitement policy, as none contains a threat of violence or other physical harm. As the Board has explained in earlier decisions, Meta requires that a post contain a “threat” and a “target” to violate this policy. The Board finds no indication of a threat in these cases. The Violence and Incitement policy also prohibits “coded statements where the method of violence is not clearly articulated, but the threat is veiled or implicit.” That policy line requires a “threat signal” as well as a “contextual signal” to be enforced. Meta identifies, among its “contextual signals” for enforcement, “local context or expertise confirm[ing] that the statement in question could lead to imminent violence.” Though the Board acknowledges there are instances and settings in which content including the phrase can be used to call for violence, there is no indication the three pieces of content under review could lead to imminent violence.
Additionally, the comment and two posts do not glorify Hamas, a designated organization under Meta’s Dangerous Organizations and Individuals policy, or its actions. None refer to Hamas or use any reference to glorify the organization or its actions. While several public comments have argued for interpreting any use of the phrase as support for Hamas, the majority of the Board rejects this approach and finds that the three pieces of content do not violate Meta’s policy, given the phrase, which was in existence before the establishment of Hamas, does not have a single meaning (see Section 1). Additionally, none of the content refers to the designated entity or attempts to justify the attacks of October 7.
Finally, the Board notes again that the phrase “From the River to the Sea” has multiple meanings, and has been adopted by various groups and individuals, each with different interpretations and intentions. While it can be used by some to encourage and legitimize antisemitism and the violent elimination of Israel and its people, it is also used as a political call for solidarity, equal rights and self-determination of the Palestinian people, and to end the war in Gaza (see Section 2). Given these uses, and as these three cases show, the phrase alone cannot be understood, regardless of context, as a call to violence against a group based on their protected characteristics, advocating for the exclusion of a particular group, or supporting a designated entity or its actions. The use of a phrase by a particular extremist terrorist group with explicit, violent, eliminationist intent and actions does not make the phrase inherently hateful or violent, taking into consideration the variety of actors who use the phrase in different ways. Similarly, the Human Rights Committee, in General Comment 37, addressed the threshold for prohibiting expression based on symbols and emblems that may have multiple meanings and interpretations, stating: “Generally, the use of flags, uniforms, signs and banners is to be regarded as legitimate form of expression that should not be restricted, even if such symbols are reminders of a painful past. In exceptional cases, where such symbols are directly and predominantly associated with incitement to discrimination, hostility, or violence, appropriate restrictions should apply,” (CCPR/C/GC/37, para. 51).
A minority of the Board believes that, while these three pieces of content do not violate Meta’s policies, the phrase “From the River to the Sea” should be presumed to constitute glorification of Hamas, a designated organization, and be removed unless it is clear the content using the phrase does not endorse Hamas and its aims. For these Board Members, after October 7, the context changed significantly and any ambiguous use of the phrase should be presumed to refer to and endorse Hamas and its actions. The minority agrees that in these three cases, there are clear signals the content does not glorify Hamas or October 7. The reasoning of the minority is provided in greater detail in the human rights analysis section (see Section 5.2).
5.2 Compliance With Meta’s Human Rights Responsibilities
The Board finds that Meta’s decisions to keep the three pieces of content up on Facebook were consistent with the company’s human rights responsibilities. The Board understands that the content at issue in the third case is no longer on Facebook as the user who posted it deleted it from the platform.
Freedom of Expression (Article 19 International Covenant on Civil and Political Rights)
Article 19 of the International Covenant on Civil and Political Rights (ICCPR) provides for broad protection of the right to freedom of expression, including “freedom to seek, receive and impart information and ideas of all kinds,” including “political discourse” and commentary on “public affairs,” (General Comment No. 34, para. 11). The Human Rights Committee has said that the scope of this right “embraces even expression that may be regarded as deeply offensive, although such expression may be restricted in accordance with the provisions of article 19, paragraph 3 and article 20” to protect the rights or reputations of others or to prohibit incitement to discrimination, hostility or violence (General Comment No. 34, para. 11). The broad protection provided to expression of political ideas extends to assemblies with a political message (ICCPR, Article 21; General Comment No. 37, paras. 32 and 49). “Given that peaceful assemblies often have expressive functions, and that political speech enjoys particular protection as a form of expression, it follows that assemblies with a political message should enjoy a heightened level of accommodation and protection,” (General Comment No. 37, para. 32). Protests can be conducted online and offline, whether jointly or exclusively. Article 21 extends to protect associated activities that take place online (paras. 6 and 34).
When restrictions on expression are imposed by a state, they must meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR). These requirements are often referred to as the “three-part test.” The Board uses this framework to interpret Meta’s human rights responsibilities in line with the UN Guiding Principles on Business and Human Rights, which Meta itself has committed to in its Corporate Human Rights Policy. The Board does this both in relation to the individual content decision under review and what this says about Meta’s broader approach to content governance. The Board agrees with the UN Special Rapporteur on freedom of expression that although “companies do not have the obligations of Governments, their impact is of a sort that requires them to assess the same kind of questions about protecting their users’ right to freedom of expression,” (A/74/486, para. 41).
As mentioned in Section 1, public comments reflect different views on how international human rights standards on limiting expression should be applied to the moderation of content containing the phrase “From the River to the Sea.” Several public comments argued that Meta’s human rights responsibilities require such content to be removed (see ADL PC-29259), given that the phrase can be identified with extreme calls to eliminate Jewish people. Others argued that it should be removed in contexts in which its spread is likely to give rise to harmful consequences for Jewish people or communities – when, for instance, there are elements suggesting that the speaker identifies with Hamas, or when the phrase is used in conjunction with other cues that connote threats of violence towards Israelis and/or Jewish people, such as “by any means necessary” or “go back to Poland,” (see ACJ PC-29479 and Professor Shany, Hersch Lauterpacht Chair in Public International Law at Hebrew University, former member and Chair of the UN Human Rights Committee PC-28895). Various public comments argued that nothing in the phrase inherently constitutes a call to violence or the exclusion of any group, nor is it linked exclusively to a statement expressing support for Hamas; rather it is primarily rooted in a Palestinian expression for liberation, freedom and equality (see SMEX PC-29396). Some public comments argued that claiming the phrase, in and of itself, carries a genocidal intent relies not on the historical record but rather on racism and Islamophobia (see Hearing Palestine Initiative PC-28564). Other public comments highlighted Meta’s responsibility to provide heightened protection to political speech, restricting content using the phrase only in specific contexts when the speaker is inciting violence, discrimination or hostility (see Human Rights Watch PC-29394).
I. Legality (Clarity and Accessibility of the Rules)
The principle of legality requires rules limiting expression to be accessible and clear, formulated with sufficient precision to enable an individual to regulate their conduct accordingly (General Comment No. 34, para. 25). Additionally, these rules “may not confer unfettered discretion for the restriction of freedom of expression on those charged with [their] execution” and must “provide sufficient guidance to those charged with their execution to enable them to ascertain what sorts of expression are properly restricted and what sorts are not” (Ibid.). The UN Special Rapporteur on freedom of expression has stated that when applied to private actors’ governance of online speech, rules should be clear and specific (A/HRC/38/35, para. 46). People using Meta’s platforms should be able to access and understand the rules, and content reviewers should have clear guidance regarding their enforcement.
The Board finds that as applied to the three pieces of content in these cases, Meta’s policies are sufficiently clear to users.
II. Legitimate Aim
Any restriction on freedom of expression should also pursue one or more of the legitimate aims listed in the ICCPR, which include protecting the rights of others. The Human Rights Committee has interpreted the term “rights” to include human rights as recognized in the ICCPR and more generally in international human rights law (General Comment No. 34, para. 28).
In several decisions, the Board has recognized that Meta’s Hate Speech policy pursues the legitimate aim of protecting the rights of others. Meta states that it does not allow hate speech because it “creates an environment of intimidation and exclusion, and in some cases may promote offline violence.” It protects the right to life (Article 6, para. 1, ICCPR) as well as the rights to equality and non-discrimination, including based on race, ethnicity and national origin (Article 2, para. 1, ICCPR; Article 2, ICERD). Conversely, the Board has repeatedly noted that it is not a legitimate aim to restrict expression for the sole purpose of protecting individuals from offense (see Depiction of Zwarte Piet, citing UN Special Rapporteur on freedom of expression, report A/74/486, para. 24), as the value that international human rights law places on uninhibited expression is high (General Comment No. 34, para. 38).
The Violence and Incitement policy aims to “prevent potential offline violence” by removing content that includes “violent speech targeting a person or a group of people on the basis of their protected characteristics” and poses “a genuine risk of physical harm or direct threats to public safety.” As previously concluded in the Alleged Crimes in Raya Kobo decision, this policy serves the legitimate aim of protecting the rights of others, such as the right to life (Article 6, ICCPR).
Meta’s Dangerous Organizations and Individuals policy aims to “prevent and disrupt real-world harm.” In several decisions, the Board has found that this policy pursues the legitimate aim of protecting the rights of others, such as the right to life (ICCPR, Article 6) and the right to non-discrimination and equality (ICCPR, Articles 2 and 26), because it covers organizations that promote hate, violence and discrimination as well as designated violent events motivated by hate (see Sudan’s Rapid Support Forces Video Captive and Greek 2023 Elections Campaign decisions).
III. Necessity and Proportionality
Under ICCPR Article 19(3), necessity and proportionality requires that restrictions on expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected,” (General Comment No. 34, para. 34).
The Board further stresses that Meta has a responsibility to identify, prevent, mitigate and account for adverse human rights impacts, to perform ongoing human rights due diligence to assess the impacts of the company’s activities (UNGPs, Principle 17) and acknowledge that the risk of human rights harms – including the increased danger for vulnerable minorities facing hostility and incitement – is heightened during conflicts (UNGPs, Principle 7, A/75/212, para. 13). The Board has repeatedly highlighted the need to develop a principled and transparent framework for content moderation of hate speech during crises and in conflict settings (see “Two Buttons” Meme, Haitian Police Station Video and Tigray Communication Affairs Bureau decisions). While it is imperative that Meta seeks to prevent its platforms from being used to intimidate, exclude or attack people on the basis of their protected characteristics, or incite acts of terrorist violence – legitimate aims of its content moderation policies – Meta’s human rights responsibilities require any limitations on expression to be necessary and proportionate, not least to respect the voice of people in communities impacted by violence. It is precisely during a rapidly evolving conflict that large social media companies must devote the resources necessary to ensure that freedom of expression is not needlessly curtailed, devoting attention to regions where risks of harm are especially grave. The Board also notes that giving researchers access to platform data and investing additional resources in the development of internal research would allow Meta to better understand correlations between online behavior and offline harm. This would place the company in a better position to fulfill its responsibility to protect human rights under the UNGPs.
The majority of the Board finds that leaving the content up in these three cases is consistent with the principle of necessity, emphasizing the importance of assessing such content in its particular context. While in some cases the phrase “From the River to the Sea” is intended as, or accompanied by, a call for violence or exclusion, or an endorsement of Hamas and its violent acts, the variety of ways in which the phrase is used, especially as part of protected political speech, means it cannot on its own be understood, regardless of context, as a call for violence, intimidation or exclusion.
As part of its analysis, the Board drew upon the six factors (context of the statement, speaker’s position or status, intent to incite, content and form of expression, extent of its dissemination, and likelihood of harm) from the Rabat Plan of Action to evaluate the capacity of both the content in these cases and the standalone phrase “From the River to the Sea” to create a serious risk of inciting discrimination, violence or other lawless actions. The Rabat factors were developed to assess when advocacy of national, racial or religious hatred constitutes incitement to harmful acts, and the Board has used them in this way previously (Knin Cartoon decision).
Context
The content in these three cases, as well as the broader adoption of the phrase, are in response to an ongoing conflict with significant regional and global consequences. All three pieces of content were posted soon after the October 7 attacks and as Israel’s ground offensive in Gaza was underway.
There are indications in the three pieces of content that the users are responding to or calling attention to the suffering of the Palestinian people and/or condemning the actions of the Israeli military. The significant impact of Israel’s military actions in Gaza, as well as doubts about its legitimacy, have been part of public debate and discussion as well as legal processes before the International Court of Justice and the International Criminal Court. Individuals and groups across the world have sought to influence that discussion, locally and globally. At this time, in a joint statement, the UN Special Rapporteurs in the field of cultural rights, on the right to education, the rights to freedom of peaceful assembly and association, and on the protection and promotion of freedom of opinion and expression, stated that “calls for an end to violence and attacks on Gaza, or for a humanitarian ceasefire, or criticism of Israeli government’s policies and actions, have in too many contexts been misleadingly equated with support for terrorism or antisemitism. This stifles free expression, including artistic expression, and creates an atmosphere of fear to participate in public life.”
More generally, the phrase and the way it is interpreted is heavily influenced by the evolving nature of the conflict and the broader context in the region, as well as globally. That context includes the increase in the use of the phrase both in support, approval or endorsement of Hamas and their violent acts, and its use in support of the Palestinian struggle for self-determination and equal rights, and alongside calls for ceasefire. The context also includes an immense surge in dangerous, dehumanizing and discriminatory rhetoric targeting Arabs, Israelis, Jews, Muslims and Palestinians.
As expressed in public comments, there have been instances of individuals using the phrase in combination with antisemitic calls, threats of violence or expressions of support for Hamas, or justifying the October 7 attacks, when accompanied by statements that call for violence or exclusion, like “by all means necessary” or “go back to Poland,” (see public comment, PC-28895), or alongside other signals of violence, “such as the image of a paraglider which recalls perpetrators of the October 7 attacks,” (see AJC PC-29354). The majority of the Board observes that the removal of violating content could be consistent with Meta’s Community Standards and human rights responsibilities in instances where the context indicates the call is one for violence or exclusion. However, such removal would not be predicated on the phrase in and of itself but rather on contextual clues or other elements present in a post that contains the phrase. Given the different meanings and uses of the phrase, assessment of context and identification of specific risks that can derive from content posted on Meta’s platforms, analyzed as a whole, are vital. Nonetheless, because the phrase does not have a single meaning, a blanket ban on content that includes the phrase, a default rule towards removal of such content, or even using it as a signal to trigger enforcement or review, would hinder protected political speech in unacceptable ways. As stated by various public comments, given the highly contextual nature of its meaning and usage, and the well documented problems automation has in conducting analysis required to understand context, the reliance on automated tools to moderate content using this phrase would “inevitably lead to over-censorship of content on matters of public concern in an ongoing armed conflict,” (see public comments: Human Rights Watch PC-29394; Integrity Institute PC-29544; also Article 19 briefing). 
This is particularly relevant in the context of the Israel-Gaza conflict, in which, as the Board previously stated in the Hostages Kidnapped From Israel and Al-Shifa Hospital decisions, Meta put in place several temporary measures, including a reduction in the confidence thresholds its automated systems use to identify and remove content, which increased the automated removal of content scored with lower confidence as violating Meta’s policies. In other words, Meta used its automated tools more aggressively to remove content that might violate its policies.
The Board also notes the prominence of the phrase in pro-Palestinian protests both online and offline across the world. The Board is aware of examples of protesters advocating for violence or praising Hamas; however, according to the Human Rights Committee, under international human rights law, there is a presumption in favor of considering assemblies to be peaceful (General Comment No. 37, CCPR/C/GC/37, paras. 15-17) and violations by some participants do not impact the rights of others. In his report, the UN Special Rapporteur on the rights to freedom of peaceful assembly and of association highlights the importance of the safe and effective exercise of these rights as ensuring “checks and balances,” and as a way of overcoming “entrenched inequalities” so endemic to conflict situations. Exercising the rights of assembly and association is “often the only available option for those who live in post-conflict and fragile contexts to raise their voices; and they are an important avenue for women, victims, youth and marginalized groups, who are otherwise often excluded from these processes to voice their grievances and concerns [and] … bring local grievances to the attention of peacemakers and the international community, which, if they are addressed, can help to resolve the root causes of conflict and prevent furthering or resurging of conflicts,” (A/78/246, paras. 2-4).
Identity of the Speaker
There is no indication that the users who posted the content in these three cases, or the pages on which the posts were shared in the second and third cases, are associated with or show support for designated organizations such as Hamas, or for discrimination and exclusion. In his public comment submission, Professor Yuval Shany, for example, identifies “whether or not the speaker using the phrase identifies himself/herself with Hamas or supports violent act undertaken by Hamas,” (PC-28895) as a relevant indicator under the Rabat analysis.
Intent, Content and Form of Expression
As explained in more detail in Section 5.1, though the phrase “From the River to the Sea” can have a wide variety of uses, the Board finds the three pieces of content under review do not show intent to incite discrimination or violence, advocate for the exclusion of a particular group, or support designated entities or their actions.
The phrase, akin to a slogan, spread very quickly and formed the basis for users to react to the October 7 terrorist attacks and Israel’s military operations in Gaza, with different meanings and intentions. As noted above, according to research commissioned by the Board, there has been a significant surge in the use of the phrase on Meta’s platforms after October 7, with the most significant increases in January and March 2024 on Facebook and in November 2023 on Instagram. The latter took place at the same time as the IDF strike on Al-Shifa Hospital and the growing humanitarian crisis in Gaza. Experts noted that Meta seems to be removing content that includes the phrase when it is accompanied by explicit signals of violence and/or discrimination. The commissioned research relied on CrowdTangle, which does not include all content on Meta’s platforms or content that has been removed by Meta. The research indicates that, for content that was left on Meta’s platforms, the phrase is generally used in posts raising awareness about the impact of the war on Palestinians, calling for a ceasefire or advocating for rights of Palestinians. Nonetheless, as previously mentioned, Meta did not provide data on content containing the phrase that was taken down due to other violations of its policies, nor has it conducted full on-platform data research “on the prevalence and use of the phrase.” The Board acknowledges that the phrase has been and continues to be used in some settings to call for exclusion or violence and may be used in that way on Meta's platforms. However, the Board would need more data to assess the nature and prevalence of content that was removed from Meta’s platforms.
Likelihood, Imminence and Reach
Analyzed as a whole, the Board finds that none of the content in these three specific cases presents a likelihood or risk of imminent violence or discrimination. As stated above, due to its multiple meanings and the varied intentions in its usage, the majority of the Board finds that the phrase itself cannot be inherently understood, in all cases or by default, and regardless of context, as harmful, violent or discriminatory.
While the Board recognizes the phrase “From the River to the Sea” can be used alongside threatening language against a Jewish or Israeli person or group, alongside more general threats of violence, or in celebration of October 7 (see public comment, AJC PC-29354), and that it is imperative for Meta to prevent these uses on its platforms, the company’s human rights responsibilities require that, in responding to these threats, it respects all human rights, including the voice of people in communities impacted by violence. The Board finds there is also a significant risk of removing content with the phrase when that content seeks to raise awareness about the suffering of people in Gaza and the dehumanization of Palestinians during an ongoing military campaign. As noted in a public comment, Meta’s platforms are among the most important tools for Palestinians to document the events occurring on the ground, seek support from the international community to hold the Israeli military and government accountable, and demand a stop to the violence (see public comment, Hearing Palestine Initiative PC-28564). Meta’s platforms are also vital vehicles for raising global awareness and mobilization in response to rising antisemitism and Islamophobia. The platforms are used to build solidarity, extend support to targeted individuals and groups, raise awareness of bigotry, counter disinformation and provide education. It is essential that these key functions of social media can be carried out in an environment in which people feel safe and respected. Enforcement of Meta’s content policies, together with continued examination of the evolution of hateful language and of the relationship between social media and offline harm, is therefore essential.
The reach of the first and third pieces of content was low, whereas the post in the second case had about 8 million views. However, the reach of the content is not a factor indicating that removal is necessary when the risk of harm is unclear.
A minority of the Board, however, finds that the context after the October 7 attacks significantly changes the analysis under the six Rabat factors, and that the meaning of the phrase must be determined with this context in mind. While the history and different uses of the phrase are relevant, its role as a statement of the violent program of a designated organization, one on multiple countries’ terrorism lists, means that the connotations of the phrase and the risks of its use have changed (context). For these Board Members, after October 7, the consideration of the phrase’s historical ambiguity no longer applies; to disregard this new reality is unreasonable and ignores that the phrase can serve as a coded endorsement of a designated entity and of a hateful ideology that presents a risk of harm.
This minority of the Board finds that Meta should adopt a default rule presuming the phrase constitutes glorification of a designated organization unless there are clear signals that the user does not endorse Hamas or the October 7 attacks. Meta should then provide guidance to its content moderators on signals of the non-violating uses of the phrase to be exempted from this default rule. For this minority, adopting this approach would allow Meta to respect the freedom of expression of users who seek to show solidarity with Palestinians and to call for specific political aims, including the equal rights of all people in Israel and the Palestinian Territories, while considering the current risk of violence related to the use of the expression in different local environments.
For the reasons stated above, the majority of the Board disagrees with this approach, given that the phrase, which predates the establishment of Hamas, does not have a single meaning, intent or understanding. Furthermore, they emphasize the advice of the UN Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism, who has warned of the risks of delegitimizing civil society by “loosely characterizing them as ‘terrorist’ [which increases] the vulnerability of all civil society actors, contributing to the perception that they are legitimate targets of abuse by State and non-State actors” (A/HRC/40/52, para. 54). See, for example, the content in summary decision Dehumanizing Comments About People in Gaza. Another minority of the Board feels strongly that drawing attention to the invocation of a phrase adopted by a terrorist group must not be considered tantamount to characterizing the individuals posting such content as terrorists themselves. This minority believes that, in adjudicating online content, the provenance and meaning of phrases must be subject to analysis and interpretation; such parsing must not be conflated with efforts to delegitimize civil society actors.
Finally, the Board acknowledges that Meta has designed a set of policies to address the risks posed by discriminatory content online. The evidence of harm produced by the cumulative, widespread and high-speed circulation of antisemitic and other harmful content on Meta’s platforms, as discussed in the Holocaust Denial decision, requires that Meta have adequate enforcement tools and measures to moderate such content without unduly curtailing political expression on issues of public interest, in line with its human rights responsibilities. Additionally, if adequately enforced, Meta’s policies provide significant guardrails to advance the goal of preventing violence and other harms resulting from terrorists’ and their supporters’ uses of Meta’s platforms. In this regard, in response to the Board’s recommendation no. 5 in the Mention of the Taliban in News Reporting case, Meta said it would develop new tools that would allow it to “gather more granular details about our enforcement of the [Dangerous Organizations and Individuals] news reporting policy allowance.” As the Board has previously recommended, this should also be extended to enforcement of the Hate Speech policy (Holocaust Denial decision, recommendation no. 1), as well as the Violence and Incitement policy (United States Posts Discussing Abortion decision, recommendation no. 1).
Data Access
The Board and external stakeholders will be better positioned to assess the necessity and proportionality of Meta’s content moderation decisions during ongoing armed conflicts if Meta continues to provide the Board and independent researchers with access to platform data. In March 2024, Meta announced it would be shutting down CrowdTangle on August 14, 2024. The company explained it would instead focus its resources on “new research tools, Meta Content Library & Content Library API.” While the Board commends Meta for developing new research tools and working to provide greater functionality, the Board is concerned by the company’s decision to shut down CrowdTangle before these new tools can effectively replace it. According to an open letter sent by several organizations urging Meta not to discontinue CrowdTangle “during a key election year,” there are significant concerns about the adequacy of the Meta Content Library to provide sufficient data access for independent monitoring. The European Commission has opened formal proceedings under the Digital Services Act against Facebook and Instagram over the decision to shut down the “real-time public insights tool CrowdTangle without an adequate replacement.” The Board echoes the concerns raised by these organizations, individuals and the European Commission about Meta’s decision to discontinue CrowdTangle during a key election year without an adequate replacement.
The Board does note that, even with CrowdTangle, there are limits to the Board’s and the public’s abilities to effectively assess the extent of the surge in antisemitic, anti-Muslim, racist and other hateful content on Meta’s platforms, and where and when that surge may be most prominent. Meta’s transparency reporting is not granular enough to evaluate the extent and nature of hateful content on its platforms. One of the recommendations (no. 16) issued by BSR in its Human Rights Due Diligence report, which was commissioned in response to the Board’s earlier recommendation in the Shared Al Jazeera Post decision, was for the company to develop a mechanism to track the prevalence of content attacking people on the basis of specific protected characteristics (for example, antisemitic, Islamophobic or homophobic content). In September 2023, one year after the BSR report was issued, Meta reported it was still assessing the feasibility of this recommendation. The Board urges Meta to fully implement this recommendation as soon as possible.
6. The Oversight Board’s Decision
The Oversight Board upholds Meta’s decisions to leave up the content in all three cases.
7. Recommendations
Transparency
1. Meta should ensure that qualified researchers, civil society organizations and journalists, who previously had access to CrowdTangle, are onboarded to the company’s new Content Library within three weeks of submitting their application.
The Board will consider this implemented when Meta provides the Board with a complete list of researchers and organizations that previously had access to CrowdTangle, and the turnaround time it took to onboard them to the Meta Content Library, at least 75% of which should be three weeks or less.
2. Meta should ensure the Meta Content Library is a suitable replacement for CrowdTangle, which provides equal or greater functionality and data access.
The Board will consider this implemented when a survey of a representative sample of onboarded researchers, civil society organizations and journalists shows that at least 75% believe they are able to reasonably continue, reproduce or conduct new research of public interest, using the Meta Content Library. This survey should be carried out longitudinally if necessary, and the results of its first iteration should be shared with the Board no later than Q1 2025.
3. Meta should implement recommendation no. 16 from the BSR Human Rights Due Diligence of Meta’s Impacts in Israel and Palestine report to develop a mechanism to track the prevalence of content attacking people on the basis of specific protected characteristics (for example, antisemitic, Islamophobic and homophobic content).
The Board will consider this recommendation implemented when Meta publishes the results of its first assessment of these metrics and issues a public commitment on how the company will continue to monitor and leverage those results.
*Procedural Note:
- The Oversight Board’s decisions are made by panels of five Members and approved by a majority vote of the full Board. Board decisions do not necessarily represent the views of all Members.
- Under its Charter, the Oversight Board may review appeals from users whose content Meta removed, appeals from users who reported content that Meta left up, and decisions that Meta refers to it (Charter Article 2, Section 1). The Board has binding authority to uphold or overturn Meta’s content decisions (Charter Article 3, Section 5; Charter Article 4). The Board may issue non-binding recommendations that Meta is required to respond to (Charter Article 3, Section 4; Article 4). When Meta commits to act on recommendations, the Board monitors their implementation.
- For this case decision, independent research was commissioned on behalf of the Board. The Board was assisted by Duco Advisors, an advisory firm focusing on the intersection of geopolitics, trust and safety, and technology. Memetica, a digital investigations group providing risk advisory and threat intelligence services to mitigate online harms, also provided research. Linguistic expertise was provided by Lionbridge Technologies, LLC, whose specialists are fluent in more than 350 languages and work from 5,000 cities across the world.