Public Comments Portal

Pakistan Political Candidate Accused of Blasphemy

Published 28 May 2024: Case selected
Published 11 June 2024: Public comments closed
Published 19 September 2024: Decision published
Upcoming: Meta implements the decision

Comments


Country
Pakistan
Language
English
Attached files
Pakistan-Political-Candidate-Accused-of-Blasphemy-2024-031-FB-MR-final-copy.docx
Name
Thiago Alves Pinto
Organization
University of Oxford
Country
United Kingdom
Language
English
Attached files
Alves-Pinto-Meta-comment-blasphemy.pdf

Case description

In January 2024, an Instagram user posted a six-second video in Urdu of a candidate for the Pakistan Muslim League (Nawaz) party in Pakistan’s February 2024 general election. The video shows the candidate saying, as part of his speech, that former Prime Minister Nawaz Sharif is “the only entity after Allah.” Text overlaying the video identifies the candidate by name and describes him as “crossing all limits of faithlessness” for his comments about the former Prime Minister, using the term “kufr,” which can be understood in Islam as the rejection or denial of Allah and his teachings. The post has been viewed approximately 48,000 times and shared more than 14,000 times. The February elections led to Nawaz Sharif’s brother, Shehbaz Sharif, becoming Pakistan’s Prime Minister.

Within a few days of the content being posted, 15 users reported it as violating Instagram’s Community Guidelines. Meta decided the content did not violate any policy, and subsequent reports were auto-closed on the basis of that earlier finding. A few days after these initial reports, Meta’s High Risk Early Review Operations (HERO) system, which is designed to identify potentially violating content predicted to have a high likelihood of going viral, detected the content based on signals indicating a high likelihood of virality. Once detected, the content was prioritized and escalated for human review by specialists with language, market and policy expertise.

A day later, following additional review by policy and subject matter experts, Meta removed the post under the Coordinating Harm and Promoting Crime policy, which prohibits “outing” individuals by exposing the identity of anyone alleged to be a member of an “outing-risk group.” In internal guidance provided to reviewers, “outing-risk groups” include people accused of blasphemy in Pakistan. According to Meta, the company removes these types of allegations “regardless of whether they have been substantiated because of the significant risk of offline harm associated with them.” Blasphemy is a crime under the Pakistan Penal Code.

Meta referred the case to the Board, noting its significance and difficulty. On the one hand, Meta informed the Board that it sees public interest value in allowing criticism of politicians on the platform during an election. On the other hand, accusations of blasphemy in Pakistan can contribute to a risk of significant offline harm if left up on the platform. This case falls within the Board’s Elections and Civic Space strategic priority.

The Board would appreciate public comments that address:

  • The political situation in Pakistan around the February 2024 elections and the role of social media in electoral campaigning and discourse.
  • The environment for freedom of expression in Pakistan, in particular relating to the enforcement of blasphemy laws against political opposition, journalists and civil society.
  • The role that blasphemy accusations against public figures play in political discourse in Pakistan and other regions, the risks such allegations can pose to individuals’ safety, and Meta’s responsibilities to prevent or mitigate potential harms from such accusations while respecting freedom of expression.
  • The implications of Meta’s Coordinating Harm and Promoting Crime policy protecting the identities of people in “outing-risk groups” in certain regions (i.e., by removing content accusing people of blasphemy) while ensuring respect for freedom of expression.
  • The human rights responsibilities of companies regarding government requests for the removal of posts containing blasphemy or allegations of blasphemy on their platforms.

As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.