New Decision Highlights Lack of Clarity Around Rules on Content Containing Accusations of Blasphemy  

The Board has upheld Meta’s decision to remove a post containing an accusation of blasphemy against a political candidate. Posted in the immediate run-up to Pakistan’s February 2024 elections, the content created a potential for imminent harm. However, the Board finds it is not clear that the relevant rule under the Coordinating Harm and Promoting Crime policy, which prohibits users from revealing the identity of a person in an “outing-risk group,” extends to public figures accused of blasphemy in Pakistan or elsewhere. It is concerning that this framing does not translate easily across cultures and languages, creating confusion for users trying to understand the rules. Meta should update its policy to make clear that users must not post accusations of blasphemy against identifiable individuals in locations where blasphemy is a crime and/or where there are significant safety risks to those accused. 

About the Case 

In January 2024, an Instagram user posted a six-second video of a candidate in Pakistan’s February 2024 elections giving a speech. In the clip, the candidate praises former Prime Minister Nawaz Sharif, stating that “the person after God is Nawaz Sharif.” The video carried a text overlay in which the user criticizes this praise for “crossing all limits of kufr” (disbelief), alleging the candidate is a non-believer according to the teachings of Islam.  

Three Instagram users reported the content the day after it was posted, and a human reviewer found it did not violate Meta’s Community Standards. The users who reported the content did not appeal that decision. Several other users reported the post over the following days, but after further human review and the automatic closing of some reports, Meta maintained that the content did not violate its rules.  

In February 2024, Meta’s High Risk Early Review Operations (HERO) system identified the content for further review based on indications that it was highly likely to go viral. The content was escalated to Meta’s policy experts, who removed it for violating the “outing” rule under the Coordinating Harm and Promoting Crime policy. Meta defines “outing” as “exposing the identity or locations affiliated with anyone who is alleged to be a member of an outing-risk group.” According to Meta’s internal guidance to reviewers, an outing-risk group includes people accused of blasphemy in Pakistan. By the time the video was flagged by HERO and removed, it had been viewed 48,000 times and shared more than 14,000 times. In March 2024, Meta referred the case to the Oversight Board.  

Offenses relating to religion are against the law in Pakistan, and the country’s social media rules mandate the removal of “blasphemous” online content.  

Key Findings  

The Board finds that, given the risks associated with blasphemy accusations in Pakistan, removing the content was in line with the Coordinating Harm and Promoting Crime policy’s rationale to prevent “offline harm.”  

It is not intuitive to users that the risks facing members of certain religious or belief minorities relate to “outing” as commonly understood, in other words, the risks that arise when a private status is publicly disclosed. The use of the term “outing” in this context is confusing in both English and Urdu. Nor is it clear that people accused of blasphemy would consider themselves members of a “group” at risk of “outing,” or that politicians would fall within an “outing-risk group” for speeches given in public, especially during elections. In short, the policy does not make clear to users that the video violates the rules. 

Furthermore, the policy does not specify which contexts are covered by its rule against outing or which groups are considered at risk. It also does not explicitly state that those accused of blasphemy are protected in locations where such accusations pose an imminent risk of harm. Meta explained that while it has an internal list of outing-risk groups, it does not publish the list, to prevent bad actors from getting around the rules. The Board does not agree that this reason justifies the policy’s overall lack of clarity. Clearly defining outing contexts and at-risk groups would inform potential targets of blasphemy allegations that such allegations are explicitly against Meta’s rules and will be removed. This, in turn, could strengthen reporting by users accused of blasphemy in contexts, such as Pakistan, where blasphemy poses legal and safety risks. Greater specificity in the public rule may also lead to more accurate enforcement by human reviewers.  

The Board is also concerned that several reviewers found the content to be non-violating, even though users repeatedly reported it and Meta’s internal guidance, which is clearer than the public policy, explicitly includes people accused of blasphemy in Pakistan among its outing-risk groups. It was only when Meta’s HERO system identified the content, seemingly after it had gone viral, that it was escalated to internal policy experts and found to be violating. As such, Meta’s at-scale reviewers should receive more tailored training, especially for contexts like Pakistan. 

The Oversight Board’s Decision 

The Oversight Board upholds Meta’s decision to remove the content. 

The Board recommends that Meta: 

  • Update the Coordinating Harm and Promoting Crime policy to make clear that users must not post accusations of blasphemy against identifiable individuals in locations where blasphemy is a crime and/or there are significant safety risks to persons accused of blasphemy. 
  • Train at-scale reviewers covering locations where blasphemy accusations pose an imminent risk of harm to the person accused, providing them with more specific enforcement guidance so they can effectively identify posts containing such allegations and weigh their nuance and context. 

For Further Information  

To read public comments for this case, click here.
