Upheld
Communal Violence in Indian State of Odisha
The Oversight Board has upheld Meta’s decision to remove a Facebook post containing a video of communal violence in the Indian state of Odisha.
To read this decision in Odia, click here.
Case Summary
The Oversight Board has upheld Meta’s decision to remove a Facebook post containing a video of communal violence in the Indian state of Odisha. The Board found that the post violated Meta’s rules on violence and incitement. The majority of the Board also concluded that Meta’s decision to remove all identical videos across its platforms was justified in the specific context of heightened tensions and ongoing violence in the state of Odisha. While the content in this case was not covered by any policy exceptions, the Board urges Meta to ensure that its Violence and Incitement Community Standard allows content that “condemns or raises awareness of violent threats.”
About the Case
In April 2023, a Facebook user posted a video of an event from the previous day depicting a religious procession held in Sambalpur, a town in the Indian state of Odisha, for the Hindu festival of Hanuman Jayanti. The video caption reads “Sambalpur,” the town where communal violence broke out between Hindus and Muslims during the festival.
The video shows a procession crowd carrying saffron-colored flags, associated with Hindu nationalism, and chanting “Jai Shri Ram,” which can be translated literally as “Hail Lord Ram” (a Hindu god). In addition to religious contexts where the phrase is used to express devotion to Ram, the expression has been used in some circumstances to promote hostility against minority groups, especially Muslims. The video then zooms in on a person standing on the balcony of a building along the route of the procession who is shown throwing a stone at the procession. The crowd then pelts the building with stones amid chants of “Jai Shri Ram,” “bhago” (which can be translated as “run”) and “maro maro” (which can be translated as “hit” or “beat”). The content was viewed about 2,000 times and received fewer than 100 comments and reactions.
Following the violence that broke out during the religious procession shown in the video, the Odisha state government shut down internet services, blocked social media platforms, and imposed a curfew in several areas of Sambalpur. During that violence, shops were reportedly set on fire and a person was killed.
Shortly after the events depicted in the video, Meta received a request from Odisha law enforcement to remove an identical video, posted by another user with a different caption. Meta found that the post violated the spirit of its Violence and Incitement Community Standard and added the video to a Media Matching Service bank. This locates and flags for possible action content that is identical or nearly identical to previously flagged photos, videos, or text.
Meta informed the Board that the Media Matching Service bank was set up to globally remove all instances of the video, regardless of the caption, given the safety risks posed by this content. This blanket removal applied to all identical videos, even if they fell within Meta’s exceptions for awareness raising, condemnation, and news reporting. The Board noted that, given the settings of the Media Matching Service bank, many pieces of content identical to this video have been removed in the months that followed the events in Sambalpur, Odisha.
Through the Media Matching Service bank, Meta identified the content at issue in this case and removed it, citing its rules prohibiting “[c]alls for high-severity violence including […] where no target is specified but a symbol represents the target and/or includes a visual of an armament or method that represents violence.”
Key Findings
The Board finds that the post violated the Violence and Incitement Community Standard, which prohibits “content that constitutes a credible threat to public or personal safety.” The majority of the Board finds that, given the ongoing violence in Odisha at the time and the fact that no policy exceptions applied, the content posed a serious and likely risk of furthering violence. A minority of the Board believes that the post could be properly removed under Meta’s Violence and Incitement Community Standard, but for a different reason: as the video depicted a past incident of incitement with no contextual clues indicating that a policy exception should apply, and similar content was being shared with the aim of inciting violence, Meta was justified in removing the content.
The majority of the Board concludes that Meta’s decision to remove all identical videos across its platforms, regardless of the accompanying caption, was justified in the context of ongoing violence at the time. The majority also finds, however, that such broad enforcement measures should be time-bound. After the situation in Odisha changes and the risk of violence associated with the content is reduced, Meta should reassess its enforcement measures for posts containing the video and apply its policy exceptions as usual. In the future, the Board would welcome approaches that limit such sweeping enforcement measures to a moment in time and to geographic areas at heightened risk. Such measures would better address the risk of harm without disproportionately impacting freedom of expression.
The minority of the Board finds that Meta’s blanket removal of all posts that included the identical video depicting an incident of incitement, regardless of whether the posts qualified for its awareness raising or condemnation exceptions, was not a proportional response and constituted an undue restriction on expression.
While the content in this case was not covered by any policy exceptions, the Board notes that the “awareness raising” exception under the Violence and Incitement Community Standard is still not available in the public-facing language of the policy. As a result, users are unaware that otherwise violating content is permitted if it is shared to condemn or raise awareness. This may prevent users from engaging in public interest discussions on Meta’s platforms.
The Oversight Board’s Decision
The Oversight Board upholds Meta’s decision to remove the content.
The Board reiterates recommendations from previous cases that Meta:
- Ensure that the Violence and Incitement Community Standard allows content containing statements with “neutral reference to a potential outcome of an action or an advisory warning,” and content that “condemns or raises awareness of violent threats.”
- Provide more clarity to users and explain in the landing page of the Community Standards, in the same way the company does with the newsworthiness allowance, that allowances to the Community Standards may be made when their rationale, and Meta’s values, demand a different outcome than a strict reading of the rules. The Board also reiterates its prior recommendation to Meta to include a link to a Transparency Center page which provides information about the “spirit of the policy” allowance.
* Case summaries provide an overview of cases and do not have precedential value.
Full Case Decision
1. Decision Summary
The Oversight Board upholds Meta’s decision to take down a post of a video on Facebook depicting a scene of communal violence in the state of Odisha in India during the Hanuman Jayanti religious festival. The video shows a procession crowd carrying saffron-colored flags, associated with Hindu nationalism, and chanting “Jai Shri Ram,” which can be translated literally as “Hail Lord Ram” (a Hindu god) and which has been used in some circumstances to promote hostility against minority groups, especially Muslims. The video then zooms in on a person standing on the balcony of a building along the route of the procession who is shown throwing a stone at the procession. The crowd then pelts the building with stones amid chants of “Jai Shri Ram,” “bhago” (which can be translated as “run”) and “maro maro” (which can be translated as “hit” or “beat”). Meta referred this case to the Board because it illustrates the tensions between Meta’s values of “Voice” and “Safety,” and it requires full analysis of contextual factors and assessment of the risks of offline harm posed by the video.
The Board finds that the post violated the Violence and Incitement Community Standard. Given the volatile context and ongoing violence in Odisha at the time the content was posted; both the nature of the religious procession and the calls for high-severity violence in the video; and the virality and widespread nature of similar content being posted on the platform, the majority of the Board finds that the content constituted a credible call for violence.
The minority of the Board believes that the post could be properly removed under Meta’s Violence and Incitement Community Standard, but for a different reason. The minority does not construe the post as a “credible call for violence” absent any contextual clues regarding the purpose of the posting. Rather, it views the post as a form of potential “depicted incitement” (i.e., content depicting a past scene of incitement). The minority concludes that the post could be removed under the Violence and Incitement Community Standard because it satisfied two conditions the minority believes must be met to warrant such a removal: 1) there was contextual evidence that postings of similar content were shared with the aim of inciting violence, and 2) the post contained no contextual clues indicating the applicability of a policy exception, such as awareness raising or news reporting.
The majority of the Board concludes that, considering the challenges of moderating content at scale, Meta’s decision to remove all identical videos across its platforms, regardless of the accompanying caption and without applying strikes, was justified in the specific context of heightened tensions and ongoing violence in the state of Odisha in which it was made. The majority also finds, however, that such broad enforcement measures should be time-bound. After the situation on the ground changes and the risk of harm associated with the piece of content under analysis by the Board is reduced, Meta should reassess its enforcement measures and allow for the application of policy exceptions at scale.
The minority of the Board finds that Meta’s blanket removal of all posts that included the identical video depicting an incident of incitement, regardless of whether the posts qualified for its awareness raising or condemnation exceptions, was not a proportional response; constituted an undue restriction on expression; and could place vulnerable persons at risk in the midst of a volatile context. The minority is of the view that the content in question is a depiction of incitement rather than incitement in itself. The minority believes a post depicting incitement should not be taken down where contextual clues point to the applicability of an exception to the Violence and Incitement policy. Such exceptions include content that is shared for purposes of spreading awareness or news reporting. The minority believes that where there are indications that the intent behind a posting of depicted incitement content is not to incite but rather to raise awareness, condemn or report, Meta’s human-rights commitments require that such content remain on the platform. The minority therefore believes that the mass removal of posts containing the video in question is an impermissible infringement on users’ free expression.
2. Case Description and Background
On April 13, 2023, a Facebook user posted a video of an event from the previous day, April 12, depicting a religious procession held in Sambalpur, in the Indian state of Odisha, for the Hindu festival of Hanuman Jayanti. The video caption reads “Sambalpur,” the town where communal violence broke out between Hindus and Muslims during the festival. The video shows a procession crowd carrying saffron-colored flags, associated with Hindu nationalism, and chanting “Jai Shri Ram,” which can be translated literally as “Hail Lord Ram” (a Hindu god). In addition to religious contexts where the phrase is used to express devotion to Ram, the expression has been used in some circumstances to promote hostility against minority groups, especially Muslims. Experts consulted by the Board reported that the chant has become “a cry of attack meant to intimidate and threaten those who worship differently.” The video then zooms in on a person standing on the balcony of a building along the route of the procession who is shown throwing a stone at the procession. The crowd then pelts the building with stones amid chants of “Jai Shri Ram,” “bhago” (which can be translated as “run”) and “maro maro” (which can be translated as “hit” or “beat”). The content was viewed about 2,000 times, received fewer than 100 comments and reactions, and was not shared or reported by anyone.
Communal violence, a form of collective violence involving clashes between communal or ethnic groups that define themselves by their differences of religion, ethnicity, language or race, is reported to be widespread in India. In this context, violence disproportionately targets religious minorities, especially Muslims, and is reportedly met with impunity. Public comments received by the Board highlight the widespread nature of communal violence across India. As of 2022, over 2,900 instances of communal violence had been registered in the country (see also PC-14046). Experts consulted by the Board explained that religious festivals and processions have reportedly been used to intimidate members of minority religious traditions and incite violence against them.
In the wake of the violence that broke out during the religious procession and its aftermath, in which shops were set on fire and a person was killed, the Odisha state government shut down internet services, blocked social media platforms, and imposed a curfew in several areas of Sambalpur. The police reportedly made 85 arrests related to the violent events in question.
On April 16, Meta received a request from Odisha law enforcement to remove an identical video, posted by another user with a different caption. Meta found that the post violated the spirit of the Violence and Incitement Community Standard and decided to remove it. Thereafter, on April 17, Meta added the video in the post to a Media Matching Service (MMS) bank, which locates and flags for possible further action content that is identical or nearly identical to previously flagged photos, videos or text. However, the user who posted that content deleted it on that same date before Meta could take action on it. Through the MMS bank, Meta then identified the content at issue in this case and removed it, also on April 17, citing its prohibition of “[c]alls for high-severity violence including […] where no target is specified but a symbol represents the target and/or includes a visual of an armament or method that represents violence.”
On April 23, the Odisha state government lifted the curfew and restored access to internet services. In July 2023, the state government announced a ban on religious processions in Sambalpur for a year.
According to reports, Bharatiya Janata Party (BJP) state leadership criticized the Odisha state government led by the Biju Janata Dal (BJD) party for its failure to maintain law and order and blamed members of minority groups, particularly Muslims, for attacking peaceful religious processions. The BJD, in turn, accused the BJP of trying to inflame religious tensions.
Meta explained that the content did not fall under a policy exception as it “was not shared to condemn or raise awareness,” since there was no academic or news report context, nor discussion of the author’s experience of being a target of violence. Additionally, Meta noted that the caption neither condemns nor expresses “any kind of negative perspective about the events depicted in the video.” The company highlighted, however, that even if the content had included an awareness raising or condemning caption, Meta would still have removed it “given the significant safety concerns and ongoing risk of Hindu and Muslim communal violence.”
Meta also disclosed to the Board that it has configured the MMS bank to remove all instances of the video regardless of the accompanying caption, even if such a caption made clear that the news reporting and/or awareness raising exceptions were implicated. Meta further explained that the company did not apply strikes to users whose content was removed by the MMS bank “to account for non-violating commentary and strike the right balance between voice and safety.”
According to reports, social media platforms have been used to encourage deadly attacks on minority groups amid rising communal tensions across India. Experts note that there have been coordinated campaigns on social media in India spreading anti-Muslim messages, hate speech or disinformation. They also observed that videos about communal violence have been spread in patterns that bore the earmarks of coordination. After violence broke out in Sambalpur, a video from Argus News, a local media outlet in Odisha, was posted on Facebook at least 34 times within 72 hours, often by pages and groups posting within minutes of each other, with claims that Muslims were behind the attack on the Hanuman Jayanti celebration in Sambalpur. Additionally, the Board notes that, given the settings of the MMS bank, many pieces of content identical to this video have been removed in the months that followed the events in Sambalpur, Odisha.
Meta referred this case to the Board, stating that the case is difficult due to the tensions between Meta’s values of “Voice” and “Safety,” and because of the context required to fully assess and appreciate the risk of harm posed by the video.
3. Oversight Board Authority and Scope
The Board has authority to review decisions that Meta submits for review (Charter Article 2, Section 1; Bylaws Article 2, Section 2.1.1).
The Board may uphold or overturn Meta’s decision (Charter Article 3, Section 5), and this decision is binding on the company (Charter Article 4). Meta must also assess the feasibility of applying its decision in respect of identical content with parallel context (Charter Article 4). The Board’s decisions may include non-binding recommendations that Meta must respond to (Charter Article 3, Section 4; Article 4). Where Meta commits to act on recommendations, the Board monitors their implementation.
4. Sources of Authority and Guidance
The following standards and precedents informed the Board’s analysis in this case:
I. Oversight Board Decisions:
The previous decisions of the Oversight Board referenced in this decision include:
- “United States Posts Discussing Abortion” cases (2023-011-IG-UA, 2023-012-FB-UA, 2023-013-FB-UA)
- “Sri Lanka Pharmaceuticals” case (2022-014-FB-MR)
- “Russian Poem” case (2022-008-FB-UA)
- “Knin Cartoon” case (2022-001-FB-UA)
- “Depiction of Zwarte Piet” case (2021-002-FB-UA)
- “Protest in India Against France” case (2020-007-FB-FBR)
- “Claimed COVID Cure” case (2020-006-FB-FBR)
II. Meta’s Content Policies:
Violence and Incitement Community Standard
The policy rationale for the Violence and Incitement Community Standard explains that it aims to “prevent potential offline harm that may be related to content” on Meta’s platforms and that while Meta “understand[s] that people commonly express disdain or disagreement by threatening or calling for violence in non-serious ways, [the company] remove[s] language that incites or facilitates serious violence.” The policy prohibits “[t]hreats that could lead to death (and other forms of high-severity violence) ... targeting people or places where threat is defined as” “calls for high-severity violence including content where no target is specified but a symbol represents the target and/or includes a visual of an armament or method that represents violence.” Under this Community Standard, Meta “remove[s] content, disable[s] accounts, and work[s] with law enforcement when [Meta] believe[s] there is a genuine risk of physical harm or direct threats to public safety.” Meta also considers the context “in order to distinguish casual statements from content that constitutes a credible threat to public or personal safety.” In assessing whether a threat is credible, Meta considers additional information such as the “person’s public visibility and the risks to their physical safety.”
Spirit of the policy allowance
As the Board discussed in the “Sri Lanka Pharmaceuticals” case, Meta may apply a “spirit of the policy” allowance to content when the policy rationale (the text that introduces each Community Standard) and Meta’s values demand a different outcome than a strict reading of the rules (i.e., the rules set out in the “do not post” section and in the list of prohibited content).
The Board’s analysis of content policies was informed by Meta’s value of “Voice,” which the company describes as “paramount,” and its value of “Safety.”
III. Meta’s Human-Rights Responsibilities
The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human-rights responsibilities of private businesses. In 2021, Meta announced its Corporate Human Rights Policy, where it reaffirmed its commitment to respecting human rights in accordance with the UNGPs. The Board's analysis of Meta’s human-rights responsibilities in this case was informed by the following international standards:
- The rights to freedom of opinion and expression: Articles 19 and 20, International Covenant on Civil and Political Rights (ICCPR); General Comment No. 34, Human Rights Committee (2011); Rabat Plan of Action, UN High Commissioner for Human Rights report: A/HRC/22/17/Add.4 (2013); UN Special Rapporteur (UNSR) on freedom of opinion and expression, reports: A/HRC/38/35 (2018); A/74/486 (2019).
- Right to life: Article 6, ICCPR.
- Freedom of religion or belief: Article 18, ICCPR; UN Special Rapporteur on freedom of religion or belief, reports: A/HRC/40/58 (2019); A/75/385 (2020).
5. User Submissions
The author of the post was notified of the Board’s review and provided with an opportunity to submit a statement to the Board. The user did not submit a statement.
6. Meta’s Submissions
When referring this case to the Board, Meta stated that the case is difficult due to the tensions between Meta’s values of “Voice” and “Safety,” and because of the context required to fully assess and appreciate the risk of harm posed by the video. Meta stated that this case is significant because of the communal clashes between Hindu and Muslim communities during the Hanuman Jayanti religious festival in Odisha.
Meta explained that the originally escalated content – a post with a video identical to the one under analysis by the Board, but with a different caption – violated the “spirit” of the Violence and Incitement policy, despite the fact that it contained an awareness raising caption, because: (1) it “raised significant safety concerns that were flagged by law enforcement,” which Meta confirmed through independent analysis; (2) it was going viral; and (3) it triggered a significant number of violating comments. Meta then configured an MMS bank to remove all instances of the video regardless of the caption, which included the video ultimately referred to the Board, given the safety risks posed by this content. In reaching this decision, Meta was able to independently corroborate the concerns raised by law enforcement based on feedback from Meta’s local public policy and safety teams, as well as local news coverage and feedback from other internal teams. In making its decision, the company considered: (1) the nature of the threat; (2) the history of violence between Hindu and Muslim communities in India; and (3) the risk of continuing violence in Odisha in the days leading up to the Hanuman Jayanti religious festival. Additionally, Meta stated that news coverage and local police reports reinforced the conclusion that this video could contribute to a risk of communal violence and retaliation.
Meta also explained that the content in this case violated the Violence and Incitement policy, since it includes a call for high-severity violence, as the video shows stones or bricks being thrown into a crowd and the crowd calling on others to “hit” or “beat” someone in response. Moreover, while Meta acknowledges that the target is not expressly identified, viewers can clearly see stones being thrown towards the building and the individual on the balcony, which Meta considers a visual depiction of a method of violence directed towards a target.
In response to the Board’s questions, Meta explained that under the letter of the Violence and Incitement policy, otherwise violating content may be allowed on the platform if the content is shared in a condemning or an awareness raising context. However, as one of the main purposes of the Violence and Incitement policy is to “prevent potential offline harm,” in this case, Meta determined that the safety concern that the originally escalated content could contribute to a risk of further Hindu and Muslim communal violence merited a spirit of the policy call to remove it (and all other instances of the video on their platforms), irrespective of the content’s caption.
Meta also determined the content did not qualify for a newsworthiness allowance, as the risk of harm outweighed its public interest value. According to Meta, the risk of harm was high for several reasons. First, the content highlights ongoing religious and political tensions between Hindus and Muslims that regularly result in violence across India. Moreover, localized incidents of this type of communal and religious violence have the potential to trigger clashes elsewhere and spread quickly beyond the initial location. Meta’s local public policy and safety teams were also concerned about the risk of recurring violence in Odisha once the curfew and internet suspension were lifted and more people could view the video. Finally, local law enforcement’s identification of the content as likely to contribute to further imminent violence corroborated Meta’s concerns.
Meta acknowledged there may be value associated with attempts to notify others of impending violence and current events. However, in this case, Meta found that the risk of harm outweighed the public interest. Meta noted that at the time the content was removed, the post was more than four days old and its value as a real-time warning had diminished. Meta underlined that the post had a neutral caption, which did not add informational value, and that the caption did not lessen the risk of the content inciting violence. According to Meta, there was widespread local and national news coverage of the underlying events in this case, which diminished the informative value of this particular post. Meta also informed the Board that “no action short of removing the content could adequately address the potential risks associated with sharing this content.”
In response to the Board’s questions, Meta noted that “in general, strikes are applied at scale for all Violence and Incitement policy violations.” But, on escalation, Meta can decide not to apply strikes based on exceptional circumstances including where the content was posted in an awareness-raising context, or the content seeks to condemn an issue of public importance.
Meta explained that the company did not apply a strike to content removed through the MMS bank mentioned above to “effectively balance voice and safety and to account for the fact that some content the bank removed would not have violated the letter of the policy.” As previously explained in this section, Meta’s decision to remove the originally reported content was based on the “spirit of the Violence and Incitement policy.” Meta added that as MMS banks were involved, there was no opportunity to review each piece of content individually, as would be done at scale. Therefore, Meta did not apply strikes so as not to further penalize users who posted content that did not violate the letter of the policy.
In response to the Board’s questions on government requests, Meta mentioned the information provided in its Transparency Center. Meta explained that, when a formal report based on a violation of local law is received from a government or local law enforcement, it is first reviewed against Meta’s Community Standards, even if it includes requests to remove or restrict content for violating local laws. If Meta determines that the content violates its policies, it is removed. However, if not, then Meta conducts a legal review to confirm whether the report is valid and performs human rights due diligence consistent with Meta’s Corporate Human Rights Policy.
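To make the ordering of that process explicit, the following is a minimal sketch of the sequence as Meta describes it, not Meta’s actual process or tooling: the function and outcome names are hypothetical, the booleans stand in for human review steps, and the local-restriction outcome is an assumption about what follows a valid report, which the description above does not spell out.

```python
from enum import Enum, auto


class Outcome(Enum):
    REMOVE = auto()            # content violates the Community Standards
    RESTRICT_LOCALLY = auto()  # report valid under local law (assumed outcome)
    NO_ACTION = auto()


def handle_government_report(
    violates_policy: bool,
    report_legally_valid: bool,
    due_diligence_passed: bool,
) -> Outcome:
    """Hypothetical restatement of the review sequence described above."""
    # Step 1: the report is first reviewed against the Community Standards,
    # even when it is framed as a local-law complaint.
    if violates_policy:
        return Outcome.REMOVE

    # Step 2: only if no policy violation is found does the legal review and
    # the human-rights due diligence (per the Corporate Human Rights Policy)
    # take place.
    if report_legally_valid and due_diligence_passed:
        return Outcome.RESTRICT_LOCALLY

    return Outcome.NO_ACTION
```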
The Board asked Meta 16 questions in writing. Questions related to Meta’s processes for government requests for content review, Meta’s use of MMS banks for at-scale enforcement, and account-level enforcement practices. Meta answered 15 questions and declined to provide a copy of the content review request received from the Odisha state law enforcement in this case.
7. Public Comments
The Oversight Board received 88 public comments relevant to this case: 31 of the comments were submitted from Asia Pacific and Oceania, 42 from Central and South Asia, eight from Europe, one from Latin America and the Caribbean, one from the Middle East and North Africa, and five from the United States and Canada. This total includes 32 public comments that were either duplicates, were submitted without consent to publish, or were submitted with consent to publish but did not meet the Board’s conditions for publication. Public comments can be submitted to the Board with or without consent to publish, and with or without consent to attribute.
The submissions covered the following themes: social and political context in India, particularly with regards to different ethnic and religious groups; relevant government policies and treatment of different ethnic and religious groups; the role of social media platforms, particularly Meta platforms, in India; whether content depicting communal violence in Odisha was likely to incite offline violence; how social media companies should treat government requests to review and/or remove content; importance of transparency reporting, especially with regards to government requests; the role of media and communications in the increase of violence and discrimination in India; importance of analyzing contextual cues and offline signals when assessing how likely a piece of content is to incite offline violence; concerns around coordinated online disinformation campaigns aimed at spreading hate against specific ethnic and religious groups.
To read public comments submitted for this case, please click here.
The Board also filed Right to Information requests with several State and Central Indian authorities. The responses received were limited to information about the local context at the time the content under review in this case was posted and prohibitory measures in Sambalpur, Odisha.
8. Oversight Board Analysis
The Board analyzed Meta's content policies, human-rights responsibilities and values to determine whether this content should be removed. The Board also assessed the implications of this case for Meta’s broader approach to content governance, particularly in contexts involving ongoing communal violence.
The Board selected this case as an opportunity to assess Meta’s policies and practices in moderating content that depicts instances of communal violence. Additionally, it provides the Board with an opportunity to discuss the nature of online incitement and provide guidance to Meta on how to address it. Finally, the case allows the Board to examine Meta’s compliance with its human-rights responsibilities in crisis and conflict situations more generally.
8.1 Compliance With Meta’s Content Policies
I. Content Rules
Violence and Incitement
The Board finds that the post violated the Violence and Incitement Community Standard, under which Meta removes “content that constitutes a credible threat to public or personal safety.” In particular, the policy prohibits “[t]hreats that could lead to death (and other forms of high-severity violence) ... targeting people or places where threat is defined as” “[c]alls for high-severity violence.” Under this policy, content containing calls to violence is considered to be violating when it contains a credible threat. In determining whether a threat is credible, Meta considers the language and context to distinguish threats from casual statements. The majority found the following factors relevant: the volatile context and ongoing violence in Odisha at the time the content was posted; the nature of the religious procession; the calls for high-severity violence in the video; and the virality and widespread nature of similar content being posted on the platform (as outlined in Section 2 above). Based on these factors, the majority of the Board finds that the content constituted a credible call for violence.
The content in this case depicts a scene of violence in which a crowd in the religious procession calls for people to throw stones/bricks (“high-severity violence”) against an unidentified person standing on the balcony of the building seen in the background (“target”). Meta includes under the definition of “target” provided to content reviewers any “person,” including anonymous persons, defined as “a real person that is not identified by name or imagery.” Meta defines “high-severity violence” as “any violence that is likely to be lethal.” Meta instructs its content reviewers to “consider a threat as high severity” if they are unsure “whether a threat is high or mid severity.” Given that all the requirements are met, the majority of the Board finds that the content violates the relevant policy line of the Violence and Incitement Community Standard.
Contextual factors are significant in this case. Stone-pelting incidents have been widespread and organized during processions and have been observed to trigger Hindu-Muslim violence (See e.g., PC-14070), especially when Hindu and Muslim religious festivals overlap. As noted in Section 2 above, these processions have been reported to display symbols associated with Hindu nationalism (e.g., saffron-colored flags) and to be accompanied by coded calls for violence (the chanting of “Jai Shri Ram”) against minority groups, particularly Muslims. Moreover, the Board is aware that social media platforms – and specifically the sharing of videos that depict acts of incitement – are used, in this context, to mobilize and incite more widespread violence, especially through “live” and video posts (Id.) akin to the one at issue in this case. The risk of high-severity violence was heightened in this case as the rally and the instigated violence resulted in a fatality, injuries, and property damage, as highlighted under Section 2 above. Thus, the content was likely to further high-severity violence.
Despite the government-imposed internet shutdown in Odisha, the Board takes note of the fact that many postings of the same video have been removed from Meta’s platforms, given the MMS bank’s settings. Interestingly, Meta informed the Board that the originally escalated video flagged by Odisha law enforcement “was going viral” when it was reviewed and included “a significant number of violating comments.” As noted in Section 2 above, there have been reports of coordinated campaigns aimed at spreading anti-Muslim disinformation and hate speech.
In the implementation guidelines to its content reviewers, Meta allows “violating content if it is shared in a condemning or raising awareness context.” Meta defines awareness raising context as “content that clearly seeks to inform and educate others about a specific topic or issue; or content that speaks to one’s experience of being a target of a threat or violence. This might include academic and media reports.” Meta told the Board that “these allowances are designed to limit the spread of content that incites violence and could have consequences offline while still allowing space for counter-speech that is not supportive but is intended to educate or warn people about threats made by third parties.”
The Board notes that while the user shared the content shortly after violence broke out in Sambalpur, Odisha, it was accompanied by a neutral caption (“Sambalpur” – the name of the town where the violent events took place). Given the neutral caption and the lack of contextual cues pointing in a different direction, the Board concludes that the content did not “clearly seek to inform and educate others about a specific topic or issue” or “speak to one’s experience of being a target of a threat or violence.” The Board finds that the content as posted did not fall under the awareness-raising exception to the Violence and Incitement Community Standard. As discussed in Section 8.2 below, the Board considered that the risk of harm outweighed the public interest value of the post. Therefore, the newsworthiness allowance should not be applied in this case.
The majority of the Board therefore concludes that given the online and offline context surrounding the posting of the content, the heightened tensions and violence that were still ongoing in Sambalpur, Odisha at the time of the posting, and the lack of any indication of the applicability of any policy exception, the content posed a serious and likely risk of furthering violence, constituting a credible threat or call for violence against religious communities engaging in confrontation in Sambalpur. Thus, its removal is consistent with Meta’s Violence and Incitement policy.
In contrast to the majority, the minority could not identify any contextual indications supporting the belief that the reposting of the video depicting a scene of a motorcycle procession in Odisha during the Hanuman Jayanti religious festival “constituted a credible call for violence.” The minority notes that there is no evidence to support the assertion that the user was issuing or endorsing such calls as voiced in the video. To interpret a post of this nature, without more, as a “credible call for violence” is a standard that could be applied to prohibit the reposting of virtually any scene depicting incitement, no matter the aim or purpose of such a post.
However, the minority believes the post in this case could be properly removed under Meta’s Violence and Incitement Community Standard for a different reason. The minority notes that the Violence and Incitement Community Standard is silent on whether posts of “depicted incitement” are banned. In the view of the minority, “depicted incitement,” which constitutes the repetition, replaying, recounting or other depiction of past expression (e.g., the posting of a video, news story, audio clip or other content), cannot properly be considered a form of incitement in itself. “Depicted incitement” differs materially from original incitement, namely expression conveyed with the intent and result of inciting harm (e.g., a video exhorting listeners to commit vandalism or a written post encouraging revenge attacks). Posts involving depicted incitement may be shared in order to raise awareness, debate recent events, condemn or analyze, and must not be construed to constitute incitement unless specific conditions are met.
Whereas the Violence and Incitement Community Standard explicitly bans depictions of past acts of kidnapping, it does not address depictions of past acts of incitement. One could therefore interpret the Standard as not covering depictions of past incitement. The minority, however, concludes that the Violence and Incitement policy may properly be applied to “depicted incitement” when either of the following conditions is met: 1) the posting of depicted incitement evinces a clear intent to incite; or 2) the posting a) contains no contextual clues indicating the applicability of a policy exception such as awareness raising or news reporting; and b) there is evidence that postings of similar content are shared with the aim of inciting violence or result in violence. The conditions spelled out in (2) were met in this case, thus rendering the content removal permissible.
The minority of the Board believes it would be important for the Violence and Incitement Community Standard to be clarified to state that the policy applies not only to content posted to incite violence, but also to “depicted incitement,” namely posts merely sharing content depicting past incitement under the above-mentioned conditions.
II. Enforcement Action
Meta employs MMS banks to locate content that is identical or nearly identical to previously flagged photos, videos, and text. These banks are able to match users' posts with content previously flagged as violating by Meta's internal teams.
Meta explained that MMS banks can be configured to take different actions once they identify banked content. In this case, Meta informed the Board that the MMS bank was set up to globally remove all instances of the video regardless of the caption, given the safety risks posed by this content. In other words, the blanket removal applied to all identical videos even if they fell within Meta’s exceptions for awareness raising, condemnation, and news reporting. Meta also mentioned that this particular MMS bank was “set up to take action without applying a strike.”
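As described above, an MMS bank pairs a matching mechanism with a configurable enforcement action. The sketch below is a minimal illustration of that architecture, assuming a fingerprint-based matcher; the names (`MediaBank`, `BankConfig`, `BankAction`) and the use of an exact hash are illustrative assumptions, not Meta’s implementation, which would rely on matching robust to re-encoding.

```python
import hashlib
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Optional, Set


class BankAction(Enum):
    REMOVE = auto()  # take matching posts down
    FLAG = auto()    # queue matching posts for human review


@dataclass
class BankConfig:
    action: BankAction = BankAction.REMOVE
    apply_strike: bool = False          # the bank in this case applied no strikes
    ignore_caption: bool = True         # enforce on the media alone
    regions: Optional[Set[str]] = None  # None = enforce globally


@dataclass
class MediaBank:
    config: BankConfig
    _fingerprints: Set[str] = field(default_factory=set)

    @staticmethod
    def _fingerprint(media: bytes) -> str:
        # Stand-in only: a production matcher would use a perceptual hash
        # robust to re-encoding and cropping, not a cryptographic hash.
        return hashlib.sha256(media).hexdigest()

    @staticmethod
    def _caption_suggests_exception(caption: str) -> bool:
        # Crude stand-in for exception detection; a real system would more
        # plausibly route such matches to human review.
        return any(m in caption.lower() for m in ("condemn", "awareness", "news"))

    def bank(self, media: bytes) -> None:
        """Add previously flagged media to the bank."""
        self._fingerprints.add(self._fingerprint(media))

    def evaluate(self, media: bytes, caption: str, region: str) -> Optional[BankAction]:
        """Return the configured action if the post matches banked media."""
        if self.config.regions is not None and region not in self.config.regions:
            return None  # a geography-bounded bank would stop here
        if self._fingerprint(media) not in self._fingerprints:
            return None
        if not self.config.ignore_caption and self._caption_suggests_exception(caption):
            return None  # defer to awareness raising / condemnation exceptions
        return self.config.action
```

On this sketch, the bank at issue corresponds to `BankAction.REMOVE` with `apply_strike=False`, `ignore_caption=True` and global scope; the time-bound, geography-bound alternative the majority favors in Section 8.2 would set `regions` and attach an expiry to the configuration.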
The Board highlights the significant impact Meta’s enforcement action has on users posting identical content for awareness raising and condemnatory purposes. Meta informed the Board that the company’s decision to remove all instances of the video was not time-limited (nor limited to certain geographic locations), and that there are no current plans to roll back the enforcement. The Board addresses Meta’s enforcement action in more detail below, under Section 8.2 in the context of the “Necessity and Proportionality” analysis.
8.2 Compliance With Meta’s Human-Rights Responsibilities
Freedom of Expression (Article 19 ICCPR)
Article 19, para. 2, of the ICCPR provides for broad protection of “the expression and receipt of communications of every form of idea and opinion capable of transmission to others,” including about “political discourse,” “religious discourse” and “journalism,” as well as expression that people may find “deeply offensive” (General Comment No. 34, (2011), para. 11). The right to freedom of expression includes the right to access information (General Comment No. 34, (2011), paras. 18-19).
Where restrictions on expression are imposed by a state, they must meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR). These requirements are often referred to as the “three-part test.” This three-part test has been proposed by the UN Special Rapporteur on freedom of expression as a framework to guide platforms’ content moderation practices (A/HRC/38/35). The Board uses this framework to interpret Meta’s voluntary human-rights commitments, both in relation to the individual content decision under review and what this says about Meta’s broader approach to content governance. As the UN Special Rapporteur on freedom of expression has stated, although “companies do not have the obligations of Governments, their impact is of a sort that requires them to assess the same kind of questions about protecting their users’ right to freedom of expression” (A/74/486, para. 41). In this case, the Board applied the three-part test not only to Meta’s decision to remove the content at issue, but also to the company’s decision to automatically remove videos identical to the one under analysis by the Board, regardless of the accompanying caption.
I. Legality (Clarity and Accessibility of the Rules)
The principle of legality under international human rights law requires rules that limit expression to be clear and publicly accessible (General Comment No. 34, para. 25). Rules restricting expression “may not confer unfettered discretion for the restriction of freedom of expression on those charged with [their] execution” and must “provide sufficient guidance to those charged with their execution to enable them to ascertain what sorts of expression are properly restricted and what sorts are not” (Ibid.). Applied to rules that govern online speech, the UN Special Rapporteur on freedom of expression has said they should be clear and specific (A/HRC/38/35, para. 46). People using Meta’s platforms should be able to access and understand the rules, and content reviewers should have clear guidance on their enforcement.
The Board finds that, as applied to the facts of this case, Meta’s prohibition of content calling for high-severity violence against unspecified targets as well as the conditions under which the prohibition is triggered are sufficiently clear.
The Board also notes that the “awareness raising” exception under the Violence and Incitement Community Standard is still not available in the public-facing language of the policy. In other words, users are still unaware that otherwise violating content is permitted if it is shared in a condemning or awareness raising context, which may prevent users from initiating or engaging in public interest discussions on Meta’s platforms. Therefore, the Board reiterates its recommendation no. 1 in the “Russian Poem” case, in which the Board urged Meta to add to the public-facing language of its Violence and Incitement Community Standard that the company interprets the policy to allow content containing statements with “neutral reference to a potential outcome of an action or an advisory warning,” and content that “condemns or raises awareness of violent threats.”
Finally, the Board notes that Meta’s decision to remove all identical videos regardless of the accompanying caption is based on the “spirit of the policy” allowance, which is not clear and accessible to users, thereby triggering serious concerns under the legality test. In this regard, the Board’s minority further views Meta’s own reference to mass removals being justified by the “spirit” of the Violence and Incitement policy as a tacit admission that the policy itself, as written, does not provide for such broad removals. The company’s decision not to apply strikes against users on the basis of the removed content further evinces recognition by Meta that the posting of such content cannot fairly be construed as a policy violation. These factors reinforce the minority’s conclusion that the company, virtually by its own admission, has failed to meet the legality test in relation to its broader enforcement action in this case.
The Board reiterates its recommendation no. 1 in the “Sri Lanka Pharmaceuticals” case, in which the Board urged Meta to provide more clarity to users and explain on the landing page of the Community Standards, in the same way the company does with the newsworthiness allowance, that allowances to the Community Standards may be made when their rationale, and Meta’s values, demand a different outcome than a strict reading of the rules. The Board also recommended that Meta include a link to a Transparency Center page that provides information about the “spirit of the policy” allowance. The Board believes that the implementation of this recommendation will address the concerns about the clarity and accessibility of Meta’s broader enforcement approach in this case.
While the Violence and Incitement policy does not specify whether “depicted incitement” is prohibited, the minority of the Board believes that such a prohibition – under limited conditions – may be inferred from the current policy but should be made explicit. The minority notes that the Violence and Incitement policy should clearly state the circumstances under which it applies to content merely depicting incitement (“depicted incitement”). The minority considers that the policy and its purposes, as applied to this case, are sufficiently clear to satisfy the legality requirement.
II. Legitimate Aim
Any restriction on freedom of expression should also pursue a “legitimate aim.” The Violence and Incitement policy aims to “prevent potential offline harm,” and under it Meta removes content that poses “a genuine risk of physical harm or direct threats to public safety.” Prohibiting calls for violence on the platform to ensure people’s safety constitutes a legitimate aim under Article 19, para. 3, as it protects the “rights of others” to life (Article 6, ICCPR) and freedom of religion or belief (Article 18, ICCPR).
III. Necessity and Proportionality
The principle of necessity and proportionality provides that any restrictions on freedom of expression "must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; [and] they must be proportionate to the interest to be protected" (General Comment No. 34, para. 34).
When analyzing the risks posed by violent content, the Board is guided by the six-factor test described in the Rabat Plan of Action, which addresses advocacy of national, racial or religious hatred that constitutes incitement to hostility, discrimination or violence. Based on an assessment of the relevant factors, especially the context, content and form, as well as likelihood and imminence of harm, further described below, the Board finds that removing the post in question is consistent with Meta’s human-rights responsibilities as it posed imminent and likely harm.
The video shows a scene of violence during a religious procession, between a person standing on a nearby building and the people in the rally, the latter chanting “Jai Shri Ram.” The Board takes note of the expert reports, discussed under Section 2 above, that “Jai Shri Ram,” which can be literally translated as “Hail Lord Ram” (a Hindu god), has been used in religious processions such as the one depicted in the video as a coded expression to promote hostility against minority groups, especially Muslims. The user posted the content one day after violence broke out in Sambalpur, at a moment when the situation was still volatile. The Board also notes, as highlighted under Section 2 above, that this religious rally in Sambalpur, Odisha led to violence and a fatality, and that these events were followed by arrests and internet shutdowns. The Board is aware of the relationship between religious processions and communal violence and highlights that stone-pelting during processions is reported to be widespread and organized and has been observed to trigger Hindu-Muslim violence (See e.g., PC-14070).
Given the online and offline context surrounding the posting of the content, the heightened tensions and violence that were still ongoing in Odisha in the period when the video was posted, and the lack of any indication of the applicability of any policy exception, the Board finds that removing the post under the Violence and Incitement Community Standard was necessary and proportionate. Considering the volatile context in Odisha at the time the post was created, the video posed a serious and likely risk of furthering violence.
The Board therefore agrees with Meta’s removal of the video in this case, given the contextual factors and the lack of a clear awareness raising purpose (as discussed in Section 8.1 above). The Board also notes that the company added the video in the originally escalated content to an MMS bank configured to remove all posts containing that same video, regardless of the caption accompanying them. That includes posts with awareness raising, condemnation and/or reporting purposes – exceptions to the Violence and Incitement Community Standard.
A majority of the Board believes that the challenges of moderating content at scale are very relevant to the assessment of this broader enforcement decision. Meta made this decision to remove content that posed a serious and likely risk of furthering violence in a moment of heightened tension and violence. In such moments, the timeliness of Meta’s enforcement actions is of the essence. As the Board has previously emphasized, mistakes are inevitable among the hundreds of millions of posts that Meta moderates every month. While mistaken removals of non-violating content (false positives) negatively impact expression, mistakenly leaving up violent threats and incitement (false negatives) presents major safety risks and can suppress the participation of those targeted (see “United States Posts Discussing Abortion” cases). Given the scale of the risks to safety that surrounded the posting of this video, in a period of heightened tensions and ongoing violence in Odisha, Meta’s decision to take down identical videos regardless of any accompanying caption, without applying strikes to penalize users, was necessary and proportional to address the potential risks of this content being widely shared.
In addition to the contextual factors highlighted in Sections 2 and 8.1 above, the majority of the Board notes that many videos identical to this one have been removed from Meta’s platforms due to the MMS bank settings, despite the government-imposed internet shutdown in Sambalpur. In particular, according to Meta’s decision rationale, the originally escalated video “was going viral” and included “a significant number of violating comments.” Under Section 2 above, the Board points to reports highlighting that there are coordinated campaigns in India spreading hate speech and disinformation against Muslims. In the same section, the Board also takes note of reports indicating that videos about communal violence have been spread in patterns that bore the earmarks of coordination. In this regard, the majority notes that, as stated by the Special Rapporteur on freedom of religion or belief, “[s]ocial media platforms are increasingly exploited as spaces for incitement to hatred and violence by civil, political and religious actors.” Relatedly, concerns “about the spread of real and constructed hate against religious minorities have been raised in India” (A/75/385, para. 35). The majority recognizes the history of frequent and widespread violence targeting Muslims, which is reportedly met with impunity.
The majority acknowledges the challenges Meta faces in removing threats of violence at scale (see “Protest in India Against France” case). When analyzing the difficulties of enforcing Meta’s policies at scale, the Board has previously emphasized that dehumanizing discourse consisting of implicit or explicit discriminatory acts or speech may contribute to atrocities. To forestall such outcomes, Meta can legitimately remove posts from its platforms that encourage violence (see “Knin Cartoon” case). In interpreting the Hate Speech Community Standard, the Board has also considered that, in certain circumstances, moderating content with the objective of addressing the cumulative harms caused by hate speech at scale may be consistent with Meta’s human-rights responsibilities. This position holds even when specific pieces of content, seen in isolation, do not appear to directly incite violence or discrimination (see “Depiction of Zwarte Piet” case). For the majority of the Board, the same can be said, given the specific context of this case, in relation to the Violence and Incitement policy.
The majority of the Board, however, notes that broad enforcement measures such as Meta’s MMS bank approach should be time-bound. After the situation in Odisha changes and the risk of violence associated with this piece of content is reduced, Meta should reassess the enforcement measures adopted to moderate posts containing the video added to the MMS bank, to ensure that policy exceptions are applied as usual. In the future, the Board would welcome approaches that limit such sweeping enforcement measures to a particular moment in time and to heightened-risk geographic areas, so that such measures are better tailored to address the risk of harm without disproportionately impacting freedom of expression.
The minority of the Board, however, does not believe that Meta’s blanket removal throughout the world of all identical videos depicting a past incident of incitement, regardless of whether the videos were shared for awareness raising (e.g., by a news outlet) or condemnation, was a proportional response. That a city or population is experiencing communal violence cannot, in and of itself, constitute grounds for such sweeping restrictions on free expression in the name of avoiding furthering such violence. This is particularly so absent a showing, or even grounds to believe, that such restrictions will have the result of lessening violence.
In situations of violent conflict, the imperative of raising awareness, sharing information and preparing communities to react to important events affecting them is paramount. Overly aggressive enforcement risks leaving vulnerable communities in the dark about unfolding events, creating the potential for rumors and disinformation to spread. Indeed, it is dangerous to assume that voice and safety are necessarily clashing goals and that one must be sacrificed for the other. Rather, they are frequently deeply intertwined: while the spread of content intended to incite may increase the risk of offline violence, suppressing information can undermine safety, often that of vulnerable populations. Such mass blanket removals also risk disproportionately affecting the speech of particular parties to a conflict, in ways that may heighten tensions and fuel the impetus to violence. They can leave individuals forcibly silenced at the very moment they most urgently need to cry out for help, or at least to bear witness. In situations of violent conflict, there is an urgent need for readily accessible information and dialogue, for which Meta’s platforms offer a primary venue. A conclusion that situations of violent conflict can, in themselves, justify sweeping restrictions on free expression would be welcome news to authoritarian governments and powerful non-state actors who engage in such violence and have an incentive to prevent the world from knowing, or to delay awareness until they have achieved their purposes.
The minority further believes that a broad policy of removing all content depicting incitement would interfere with the vital role of news organizations in covering global events, limiting the distribution of their news content on Meta’s platforms whenever the events depicted include past incitement to violence. The timely dissemination of such information raises awareness that can play an essential role in tempering violence or rallying opposition to it. The minority is also concerned that the blanket takedown of posts depicting incitement could impair efforts to identify, and hold accountable, those responsible for real-world incitement to violence occurring off the platform. Given that the virality and reach of a Meta post occur mostly in the hours and days immediately after it is shared, the minority does not believe that, even on a time-bound basis, a blanket prohibition of “depicted incitement” on the platform is compatible with Meta’s values and human-rights commitments. The aggressive policing of content, without regard to the motives and context in which it is posted, constitutes an abdication of Meta’s responsibility to uphold the company’s own foremost commitment to voice and its international human-rights commitment to protect freedom of expression. The minority is concerned that the majority’s reasoning could be embraced by repressive governments to legitimize self-serving internet shutdowns and other forms of information suppression, in the name of preventing what might be termed depictions of incitement but what amounts to timely, potentially life-saving information about violence toward civilians or minority groups.
For the minority, the Board should not readily defer to Meta’s mere assertions of the challenges of “scale” to justify such sweeping speech bans, particularly in a context where the Odisha state government has shut down the internet, reached out directly to Meta about its content moderation, and banned freedom of assembly for a year.
The minority believes that a social media company that operates at a particular scale must ensure the application of its policies at that same scale. In addition, as the UN Special Rapporteur has noted, social media companies have a “range of options short of deletion that may be available . . . in given situations” (A/74/486, para. 51), such as geoblocking, reducing amplification, warning labels, and promoting counter-speech. The Special Rapporteur has also stated that, “just as States should evaluate whether a limitation on speech is the least restrictive approach, so too should companies carry out this kind of evaluation. And, in carrying out the evaluation, companies should bear the burden of publicly demonstrating necessity and proportionality” (id., emphasis added). The Board has previously called on Meta to explain the continuum of options at its disposal for achieving legitimate aims and to articulate why the selected option is the least intrusive means (see “Claimed COVID Cure” case). For the minority, information along the lines proposed by the Special Rapporteur would help in assessing whether a sweeping mass removal of key content during a crisis is necessary and proportionate. Moreover, in engaging in such a public dialogue, Meta could explain in more detail to the Board and the public, particularly given its announced achievements in artificial intelligence, its efforts to improve automated technologies for detecting posts that may fall within its own policy exceptions.
9. Oversight Board Decision
The Oversight Board upholds Meta's decision to take down the content.
10. Recommendations
The Oversight Board decided not to issue new recommendations in this decision, given the relevance of recommendations previously issued in other cases. The Board therefore reiterates the following recommendations, which it urges Meta to follow closely:
- Ensure that the Violence and Incitement Community Standard allows content containing statements with a “neutral reference to a potential outcome of an action or an advisory warning,” as well as content that “condemns or raises awareness of violent threats” (Recommendation no. 1, “Russian Poem” case). The Board expects these policy changes to be reflected downstream in Meta’s at-scale enforcement, including its detection of content for enforcement, its classifier and automated enforcement systems, and the review guidelines used by its content reviewers.
- Provide more clarity to users by explaining, on the landing page of the Community Standards and in the same way the company does for the newsworthiness allowance, that allowances to the Community Standards may be made when their rationale and Meta’s values demand a different outcome from a strict reading of the rules. The Board also recommends that Meta include a link to a Transparency Center page providing information about the “spirit of the policy” allowance (Recommendation no. 1, “Sri Lanka Pharmaceuticals” case).
*Procedural Note:
The Oversight Board’s decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.
For this case decision, independent research was commissioned on behalf of the Board. The Board was assisted by an independent research institute headquartered at the University of Gothenburg which draws on a team of over 50 social scientists on six continents, as well as more than 3,200 country experts from around the world. The Board was also assisted by Duco Advisors, an advisory firm focusing on the intersection of geopolitics, trust and safety, and technology. Memetica, an organization that engages in open-source research on social media trends, also provided analysis. Linguistic expertise was provided by Lionbridge Technologies, LLC, whose specialists are fluent in more than 350 languages and work from 5,000 cities across the world.