Oversight Board Announces Two Cases Involving a Pakistani Parliament Speech and Sudan’s Rapid Support Forces Video

Today, the Board is announcing two new cases for consideration. As part of this, we are inviting people and organizations to submit public comments.

Case Selection

As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse or raise important questions about Meta’s policies.

The cases that we are announcing today are:

Reporting on Pakistani Parliament Speech

2023-038-FB-MR

Case referred by Meta

Submit public comments, which can be provided anonymously, here.

In May 2023, a Pakistani news channel posted on its Facebook page a video of a politician addressing members of the country’s parliament. In the video, the politician’s speech references an ancient tradition in which people were sacrificed in the Nile River to control flooding. The politician uses the tradition as a comparison to what should happen in present-day Pakistan and says that, in a previous speech, they had stated that Pakistan will not “heal itself” until different types of public officials, including the military, are hanged.

The politician then alludes to the ongoing political crisis in Pakistan, implicating themselves and their colleagues when they say, “we are all responsible for this.” Text overlaying the video and the post’s caption repeat the politician’s previous statements about hanging public officials. The caption also mentions the strong reaction the speech generated in parliament. The content has been shared about 20,000 times and has about 40,000 reactions, the majority of which are “likes.”

The content was posted in the week following the arrest of Pakistan’s former Prime Minister. The arrest prompted protests and clashes with the police, which left at least eight protestors dead and deepened the country’s political crisis. News reports said that thousands of the former Prime Minister’s supporters, including politicians, were arrested following the protests. Journalists have been targeted with arrests and charges of mutiny. Elections are scheduled for February 8, 2024, having been postponed twice in 2023.

Under Meta’s Violence and Incitement policy, the company removes “statements advocating for high-severity violence.” However, it will allow potentially violating statements if shared in an awareness-raising context, including content that clearly seeks to inform others about a specific topic or issue. Meta will also permit these statements under its newsworthiness allowance, which allows otherwise violating content to remain on the company’s platforms when the public interest value outweighs the risk of harm.

Between June and September 2023, Meta’s automated systems identified the content as potentially violating 45 times. Several human reviews, including under the cross-check system and involving Meta’s regional operations team and its policy and subject-matter experts, came to different outcomes. The final review determined the content did not violate the Violence and Incitement policy, and the video remained on Facebook.

Meta referred the case to the Board, stating that it is “significant and difficult because it involves a politician’s violent speech used in a rhetorical manner, and required consideration of the context around the post to reach our decision.”

The Board selected this case because it raises important questions about how Meta should treat speech from politicians, and related news coverage of that speech, on its platforms, particularly in the lead-up to elections. This case provides an opportunity to directly explore issues around the protection of journalism. Additionally, it falls within the Board’s strategic priority of Elections and Civic Space.

The Board would appreciate public comments that provide information on:

  • The political and human-rights situation in Pakistan, particularly as it relates to criticism of the government in advance of the February 2024 elections.
  • Media freedom in Pakistan, including the role that social-media platforms play in disseminating independent coverage of political events.
  • Information about government requests to remove social-media content in Pakistan and elsewhere.
  • Meta’s moderation of content featuring politicians, particularly when that content features what could be considered violent speech, but which is used in a rhetorical manner.
  • What the criteria should be for deciding when content is “newsworthy” or posted for the purpose of “awareness raising.”

As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

Sudan’s Rapid Support Forces Video Captive

2023-039-FB-UA

User appeal to remove content from Facebook

Submit public comments, which can be provided anonymously, here.

In August 2023, a Facebook user posted a video showing armed men in Sudan detaining a person in the back of a military vehicle. The man, who is speaking Arabic in the video and is not the user who posted the content, identifies himself as a member of the Rapid Support Forces (RSF). He goes on to claim that the detained person is a drone signaller from another country who was assisting the Sudanese Armed Forces (SAF). The caption below the video, also in Arabic, accuses the RSF’s opponents of collaborating with foreigners.

The man in the video claims they are pursuing the SAF leadership and their foreign associates in Sudan, and states that they remain loyal to their leader, Mohamed Hamdan Dagalo. The video includes derogatory remarks about foreign nationals and the leaders of other nations that are supporting the SAF.

In April 2023, fighting broke out in Sudan between the SAF and the RSF, a paramilitary group. Other groups have since joined the armed conflict, which has left thousands dead and forced more than four million people to flee. The UN has condemned the violence and warned about the devastating impact on civilians and the humanitarian situation in the country. Meta designated the RSF as a Tier 1 terrorist organization on August 11, 2023, under its Dangerous Organizations and Individuals policy. The U.S. Treasury Department sanctioned Abdelrahim Hamdan Dagalo, an RSF figurehead, on September 6, 2023. According to Meta, he is the brother of Mohamed Hamdan Dagalo, who leads the RSF.

Shortly after it was posted, several Facebook users reported the content for terrorism, hate speech and violence. Due to low severity and virality scores, these reports were not prioritized for human review and the content remained on the platform. After the Board brought the case to Meta’s attention, the company reviewed the post under its Dangerous Organizations and Individuals policy and removed it from Facebook.

The Board selected this case to assess the scope and enforcement of Meta’s Dangerous Organizations and Individuals policy in an ongoing conflict in a country where civic space is significantly restricted. This case falls within the Board’s strategic priorities, specifically Crisis and Conflict Situations.

The Board would appreciate public comments that provide information on:

  • Information on the RSF’s treatment of hostages or prisoners of war, and how the use of social media that identifies them impacts their safety and exposes them to degrading and humiliating treatment and public curiosity.
  • How the RSF and SAF are using social media to shape the narratives around the conflict, and whether Meta’s designation of the RSF as a dangerous organization has impacted access to information and the safety of people in Sudan.
  • How international humanitarian law (also known as the law of armed conflict) applies to Meta’s moderation of posts showing identifiable prisoners of war and hostages.
  • Meta’s enforcement of its content policies for Arabic-language content about the conflict in Sudan, in particular video posts.
  • Meta’s prioritization of content for automated and human review in conflict situations, and the principles and factors that should guide the design of operations to ensure the most harmful content is reviewed and actioned.

As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

Public Comments

If you or your organization feel you can contribute valuable perspectives that can help with reaching a decision on the cases announced today, you can submit your contributions using the link above. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23:59 your local time on Tuesday 23 January.

What's Next

Over the next few weeks, Board members will be deliberating these cases. Once they have reached their final decisions, we will post them on the Oversight Board website.

To receive updates when the Board announces new cases or publishes decisions, sign up here.
