New Decision Highlights Why Standalone Use of “From the River to the Sea” Should Not Lead to Content Removal
September 4, 2024
Board Also Underscores Importance of Data Access for Independent Monitoring of Meta’s Moderation
In reviewing three cases involving different pieces of Facebook content containing the phrase “From the River to the Sea,” the Board finds they did not break Meta’s rules on Hate Speech, Violence and Incitement, or Dangerous Organizations and Individuals. Specifically, the three pieces of content contain contextual signals of solidarity with Palestinians – but no language calling for violence or exclusion. They also do not glorify or even refer to Hamas, an organization designated as dangerous by Meta. In upholding Meta’s decisions to keep up the content, the majority of the Board notes the phrase has multiple meanings and is used by people in various ways and with different intentions. A minority, however, believes that because the phrase appears in the 2017 Hamas charter and given the October 7 attacks, its use in a post should be presumed to constitute glorification of a designated entity, unless there are clear signals to the contrary.
These three cases highlight the tension between Meta’s value of voice, which protects freedom of expression and particularly political speech during conflict, and its values of safety and dignity, which protect people against intimidation, exclusion and violence. The ongoing conflict that followed the Hamas terrorist attack of October 2023 and Israel’s subsequent military operations has led to protests globally and to accusations against both sides of violating international law. Equally relevant, both to these cases and to the broader use of “From the River to the Sea” on Meta’s platforms, is the surge in antisemitism and Islamophobia. These cases have again underscored the importance of data access for effectively assessing Meta’s content moderation during conflicts, as well as the need for a method to track the amount of content attacking people based on a protected characteristic. The Board’s recommendations urge Meta to ensure its new Content Library is an effective replacement for CrowdTangle and to fully implement a recommendation from the BSR Human Rights Due Diligence Report of Meta’s Impacts in Israel and Palestine.
About the Cases
In the first case, a Facebook user commented on a video posted by a different user. The video’s caption encourages others to “speak up” and includes hashtags such as “#ceasefire” and “#freepalestine.” The user’s comment includes the phrase “FromTheRiverToTheSea” in hashtag form, additional hashtags such as “#DefundIsrael” and heart emojis in the colors of the Palestinian flag. Viewed about 3,000 times, the comment was reported by four users but these reports were automatically closed because Meta’s automated systems did not prioritize them for human review.
The Facebook user in the second case posted what is likely to be a generated image of floating watermelon slices that form the words from the phrase, alongside “Palestine will be free.” Viewed about 8 million times, this post was reported by 937 users. Some of these reports were assessed by human moderators who found the post did not break Meta’s rules.
For the third case, an administrator of a Facebook page reshared a post by a Canadian community organization, in which the founding members declared support for the Palestinian people, condemned their “senseless slaughter” and the “Zionist Israeli occupiers.” With fewer than 1,000 views, this post was reported by one user but the report was automatically closed.
In all three cases, users then appealed to Meta to remove the content but the appeals were closed without human review following an assessment by one of the company’s automated tools. After Meta upheld its decisions to keep the content on Facebook, the users appealed to the Board.
Unprecedented terrorist attacks by Hamas on Israel in October 2023, which killed 1,200 people and saw 240 hostages taken, have been followed by a large-scale military response by Israel in Gaza, killing over 39,000 people (as of July 2024). Both sides have since been accused of violating international law, and of committing war crimes and crimes against humanity. This has generated worldwide debate, much of which has taken place on social media, including Facebook, Instagram and Threads.
Key Findings
The Board finds there is no indication that the comment or the two posts broke Meta’s Hate Speech rules because they do not attack Jewish or Israeli people with calls for violence or exclusion, nor do they attack a concept or institution associated with a protected characteristic that could lead to imminent violence. Instead, the three pieces of content contain contextual signals of solidarity with Palestinians, in the hashtags, visual representation or statements of support. On other policies, they do not break the Violence and Incitement rules nor do they violate Meta’s Dangerous Organizations and Individuals policy as they do not contain threats of violence or other physical harm, nor do they glorify Hamas or its actions.
In coming to its decision, the majority of the Board notes that the phrase “From the River to the Sea” has multiple meanings. While it can be understood by some as encouraging and legitimizing antisemitism and the violent elimination of Israel and its people, it is also often used as a political call for solidarity, equal rights and self-determination of the Palestinian people, and to end the war in Gaza. Given this fact, and as these cases show, the standalone phrase cannot be understood as a call to violence against a group based on its protected characteristics, as advocacy for the exclusion of a particular group, or as support for a designated entity – Hamas. The phrase’s use by this terrorist group, with its explicit violent eliminationist intent and actions, does not make the phrase inherently hateful or violent, considering the variety of people using it in different ways. It is vital that factors such as context and identification of specific risks are assessed to analyze content posted on Meta’s platforms as a whole. Though removing content could have aligned with Meta’s human rights responsibilities if the phrase had been accompanied by statements or signals calling for exclusion or violence, or legitimizing hate, such removal would not be based on the phrase itself, but rather on other violating elements, in the view of the majority of the Board. Because the phrase does not have a single meaning, a blanket ban on content that includes the phrase, a default rule towards removal of such content, or even using it as a signal to trigger enforcement or review, would hinder protected political speech in unacceptable ways.
In contrast, a minority of the Board finds that Meta should adopt a default rule presuming the phrase constitutes glorification of a designated organization, unless there are clear signals the user does not endorse Hamas or the October 7 attacks.
One piece of research commissioned by the Board for these cases relied on the CrowdTangle data-analysis tool. Access to platform data is essential for the Board and other external stakeholders to assess the necessity and proportionality of Meta’s content moderation decisions during armed conflicts. This is why the Board is concerned by Meta’s decision to shut down the tool while questions remain over whether the newer Meta Content Library is an adequate replacement.
Finally, the Board recognizes that, even with research tools, there is limited ability to effectively assess the extent of the surge in antisemitic, Islamophobic, racist and hateful content on Meta’s platforms. The Board urges Meta to fully implement a recommendation previously issued in the BSR Human Rights Due Diligence report to address this.
The Oversight Board’s Decision
The Oversight Board upholds Meta’s decisions to leave up the content in all three cases.
The Board recommends that Meta:
- Ensure that qualified researchers, civil society organizations and journalists, who previously had access to CrowdTangle, are onboarded to the new Meta Content Library within three weeks of submitting their application.
- Ensure its Content Library is a suitable replacement for CrowdTangle, providing equal or greater functionality and data access.
- Implement recommendation no. 16 from the BSR Human Rights Due Diligence of Meta’s Impacts in Israel and Palestine report to develop a mechanism to track the prevalence of content attacking people based on specific protected characteristics (for example, antisemitic, Islamophobic and homophobic content).
For Further Information
For Arabic translation, click here.
For Hebrew translation, click here.
To read public comments for this case, click here.