Public Comments Portal

Footage of Moscow Terrorist Attack

Published on July 11, 2024: Case selected
Published on July 25, 2024: Public comments closed
Upcoming: Decision published
Upcoming: Meta implements decision

Thank you for your interest in submitting a public comment. This public comment portal is now closed. Please check out the Oversight Board website to find out more about our new cases and decisions.

Case Description

These three cases concern Meta's decisions to remove content from Facebook, which the Oversight Board intends to address together.

Three posts were shared by different users on Facebook immediately after the March 22, 2024, terrorist attack at a concert venue and shopping center in Moscow. Each contains imagery of the attack. Meta decided to remove all three posts.

In the first case, a Facebook user posted a short video clip accompanied by a caption in English. The video shows part of the attack from inside the shopping center, with the footage seemingly taken by a bystander. Armed people are shown shooting unarmed people at close range, with some victims crouching on the ground and others fleeing the building. The footage is not high resolution; the attackers and people being shot are visible but not easily identifiable, whereas others leaving the building are identifiable. In the audio, gunfire can be heard, with people screaming. The caption asked what is happening in the country and included prayers for those impacted. When Meta removed the post, it had fewer than 50 views.

In the second case, a different Facebook user posted a shorter clip of the same footage, also accompanied by a short caption in English. The caption warned viewers about the video’s content, stating there is no place in the world for terrorism, with the addition of a #terrorist hashtag. When Meta removed the post, it had fewer than 50 views.

The third case involves a post shared on a group page by one of its administrators. The group’s description expresses support for former French Presidential candidate Éric Zemmour. The post included a still image from the attack, which could have been taken from the same video, showing armed gunmen and victims. Additionally, there is a short video filmed by someone driving past the shopping center, which is on fire. The French caption included the word “Alert” alongside commentary on the attack, such as the reported number of fatalities. The caption also stated that Ukraine had said it had nothing to do with the attack, while pointing out that nobody had claimed responsibility for it. The caption concluded with a comparison to the Bataclan terrorist attack in Paris and a statement of support for the Russian people. When Meta removed the post, it had about 6,000 views.

The company removed all three posts under its Dangerous Organizations and Individuals Community Standard, which prohibits sharing third-party imagery depicting the moment of terrorist attacks on visible victims. Meta told the Board that “quickly removing [a] moment of attack imagery on visible victims promotes safety by helping to mitigate the risk of contagion and copycat attacks while disrupting the spread of perpetrator propaganda.” The company noted that such content removal also protects the dignity of victims and their families who may not want the footage published. However, Meta recognizes that removals may lead to the risk of over-enforcement on content that neutrally discusses, condemns or raises awareness of terrorist attacks. The company added that while it may keep otherwise violating content on its platforms under the newsworthiness allowance in limited cases, this allowance is rarely applied to video footage of violent events.

Meta designated the Moscow attack as a terrorist attack on March 22, 2024, although these designations are generally not made public. The content in the first and second cases was removed automatically because Meta’s subject matter experts had assessed another instance of the same video as violating and added it to a Media Matching Service (MMS) Bank. Meta did not apply a strike for these removals. In the third case, the content was removed following human review, and Meta applied a strike that resulted in feature limits. These limits prevented the user from creating content on the platform, creating or joining Messenger rooms, advertising, and creating live videos.

In all three cases, the users appealed to Meta. Human reviewers found each post violating. After the Board selected these cases, Meta confirmed its decisions to remove all three posts, but removed the strike in the third case.

In their appeals to the Board, all three users emphasized the importance of informing others about real-world events and underlined that the posts did not seek to glorify violence or hatred.

The Board selected these cases to consider how Meta should moderate imagery of terrorist attacks on its platforms. These cases fall within the Board’s strategic priority of Crisis and Conflict Situations.

The Board would appreciate public comments that address:

  • Research into harms resulting from the online dissemination of terrorist attack footage, as well as the effectiveness of attempts to limit that dissemination.
  • Whether Meta should distinguish video footage of terrorist attacks taken and/or shared by perpetrators from third-party footage of such events, considering its human rights responsibilities.
  • The impact of prohibiting terrorist attack imagery in countries with closed media environments, in particular where the government may promote misinformation about the attacks.
  • Regulatory incentives for platforms to rapidly remove terrorist content, and the impact of such regulation on speech about terrorist attacks.

As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to these cases.