New Case to Focus on Accusation of Blasphemy Against a Political Candidate in Pakistan

Today, the Board is announcing a new case for consideration. As part of this, we are inviting people and organizations to submit public comments by using the button below.

Case Selection

As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse or raise important questions about Meta's policies.

The case that we are announcing today is:

Pakistan Political Candidate Accused of Blasphemy

2024-031-FB-MR

Case Referred by Meta

Submit a public comment using the button below

To read this announcement in Urdu, click here.


In January 2024, an Instagram user posted a six-second video in Urdu of a candidate for the Pakistan Muslim League (Nawaz) party in Pakistan's February 2024 general election. The video shows the candidate saying, as part of his speech, that former Prime Minister Nawaz Sharif is "the only entity after Allah." Text overlaying the video identifies the candidate by name and describes him as "crossing all limits of faithlessness" for his comments about the former Prime Minister; it also uses the term "kufr," which, in Islam, can be understood as the rejection or denial of Allah and his teachings. The post has been viewed approximately 48,000 times and shared more than 14,000 times. The February elections led to Nawaz Sharif's brother, Shehbaz Sharif, becoming Pakistan's Prime Minister.

Within a few days of the content being posted, 15 users reported it as violating Instagram's Community Guidelines. Meta decided the content did not violate any policy, and subsequent reports were auto-closed on the basis of those earlier no-violation decisions. A few days after these initial reports, Meta's High Risk Early Review Operations (HERO) system, which is designed to identify potentially violating content that is predicted to go viral, detected the content based on signals indicating a high likelihood of virality. Once detected, the content was prioritized and escalated for human review by specialists with language, market and policy expertise.
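The announcement describes HERO only at a high level. Purely as an illustration, the minimal Python sketch below shows the general shape of such a pipeline: engagement signals are combined into a virality score, and posts above a threshold are pushed into a priority queue for specialist human review. Every name, signal, weight and threshold here is a hypothetical assumption for illustration, not Meta's actual system.

```python
# Illustrative sketch only: a hypothetical virality-based review queue,
# loosely modeled on the announcement's description of Meta's HERO system.
# Signal names, weights and the threshold are invented, not Meta's model.

from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class ReviewItem:
    priority: float                        # negated score, so higher scores pop first
    post_id: str = field(compare=False)    # excluded from ordering comparisons

def virality_score(views_per_hour: float, shares_per_hour: float,
                   reshare_depth: int) -> float:
    """Combine engagement signals into a single virality estimate.
    The weights are arbitrary placeholders."""
    return 0.4 * views_per_hour + 0.5 * shares_per_hour + 0.1 * reshare_depth

ESCALATION_THRESHOLD = 1000.0  # hypothetical cut-off

queue: list[ReviewItem] = []

def maybe_escalate(post_id: str, views_per_hour: float,
                   shares_per_hour: float, reshare_depth: int) -> None:
    """Queue posts predicted to go viral for prioritized human review."""
    score = virality_score(views_per_hour, shares_per_hour, reshare_depth)
    if score >= ESCALATION_THRESHOLD:
        heapq.heappush(queue, ReviewItem(-score, post_id))

# Example: a post gaining roughly 2,000 views and 600 shares per hour
maybe_escalate("post_123", views_per_hour=2000, shares_per_hour=600, reshare_depth=4)
while queue:
    item = heapq.heappop(queue)
    print(f"Escalate {item.post_id} to specialist review (score={-item.priority:.0f})")
```

In a real system the scoring model and thresholds would be far more sophisticated; the point of the sketch is only that automated detection ranks content for human reviewers rather than making the final decision itself.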

A day later, following additional review by policy and subject matter experts, Meta removed the post under the Coordinating Harm and Promoting Crime policy, which prohibits "outing" individuals by exposing the identity of anyone who is alleged to be a member of an "outing-risk group." In internal guidance provided to reviewers, "outing-risk groups" include people accused of blasphemy in Pakistan. According to Meta, the company removes these types of allegations "regardless of whether they have been substantiated because of the significant risk of offline harm associated with them." Blasphemy is a crime under the Pakistan Penal Code.

Meta referred the case to the Board, noting its significance and difficulty. On the one hand, Meta informed the Board that it sees public interest value in allowing criticism of politicians on the platform during an election. On the other hand, accusations of blasphemy in Pakistan can contribute to a risk of significant offline harm if left on the platform. This case falls within the Board's Elections and Civic Space strategic priority.

The Board would appreciate public comments that address:

  • The political situation in Pakistan around the February 2024 elections and the role of social media in electoral campaigning and discourse.
  • The environment for freedom of expression in Pakistan, in particular relating to the enforcement of blasphemy laws against political opposition, journalists and civil society.
  • The role that blasphemy accusations against public figures play in political discourse in Pakistan and other regions, the risks such allegations can pose to individuals’ safety, and Meta’s responsibilities to prevent or mitigate potential harms from such accusations while respecting freedom of expression.
  • The implications of Meta’s Coordinating Harm and Promoting Crime policy protecting the identities of people in “outing-risk groups” in certain regions (i.e., removing content that accuses people of blasphemy) while ensuring respect for freedom of expression.
  • The human rights responsibilities of companies regarding government requests for the removal of posts containing blasphemy or allegations of blasphemy on their platforms.

As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

Public Comments

If you or your organization feel you can contribute valuable perspectives that can help the Board reach a decision on the case announced today, you can submit your contributions using the button below. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23:59 Pacific Standard Time (PST) on Tuesday 11 June.

What’s Next

Over the next few weeks, Board Members will be deliberating this case. Once they have reached their decision, we will post it on the Decisions page.
