New Cases to Explore How Content Showing Terrorist Attacks Should Be Moderated
July 11, 2024
Today, the Board is announcing three new cases for consideration. As part of this, we are inviting people and organizations to submit public comments by using the button below.
Case Selection
As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse or raise important questions about Meta’s policies.
The cases that we are announcing today are:
Footage of Moscow Terrorist Attack
2024-038-FB-UA, 2024-039-FB-UA, 2024-040-FB-UA
User appeals to restore content to Facebook
Submit a public comment using the button below
To read this announcement in Russian, click here.
These three cases concern decisions by Meta to remove content from Facebook; the Oversight Board intends to address them together.
Three posts were shared by different users on Facebook immediately after the March 22, 2024, terrorist attack at a concert venue and shopping center in Moscow. Each contains imagery of the attack. Meta decided to remove all three posts.
In the first case, a Facebook user posted a short video clip accompanied by a caption in English. The video shows part of the attack from inside the shopping center, with the footage seemingly taken by a bystander. Armed people are shown shooting unarmed people at close range, with some victims crouching on the ground and others fleeing the building. The footage is not high resolution; the attackers and people being shot are visible but not easily identifiable, whereas others leaving the building are identifiable. In the audio, gunfire can be heard, with people screaming. The caption asked what was happening in the country and included prayers for those impacted. When Meta removed the post, it had fewer than 50 views.
In the second case, a different Facebook user posted a shorter clip of the same footage, also accompanied by a short caption in English. The caption warned viewers about the video’s content, stating there is no place in the world for terrorism, with the addition of a #terrorist hashtag. When Meta removed the post, it had fewer than 50 views.
The third case involves a post shared on a group page by one of its administrators. The group’s description expresses support for former French Presidential candidate Éric Zemmour. The post included a still image from the attack, which could have been taken from the same video, showing armed gunmen and victims. Additionally, there is a short video filmed by someone driving past the shopping center, which is on fire. The French caption included the word “Alert” alongside commentary on the attack, such as the reported number of fatalities. The caption also stated that Ukraine had said it had nothing to do with the attack, while pointing out that nobody had claimed responsibility for it. The caption concluded with a comparison to the Bataclan terrorist attack in Paris and a statement of support for the Russian people. When Meta removed the post, it had about 6,000 views.
The company removed all three posts under its Dangerous Organizations and Individuals Community Standard, which prohibits sharing third-party imagery depicting the moment of terrorist attacks on visible victims. Meta told the Board that “quickly removing [a] moment of attack imagery on visible victims promotes safety by helping to mitigate the risk of contagion and copycat attacks while disrupting the spread of perpetrator propaganda.” The company noted that such content removal also protects the dignity of victims and their families who may not want the footage published. However, Meta recognizes that removals may lead to the risk of over-enforcement on content that neutrally discusses, condemns or raises awareness of terrorist attacks. The company added that while it may keep otherwise violating content on its platforms under the newsworthiness allowance in limited cases, this allowance is rarely applied to video footage of violent events.
Meta designated the Moscow attack as a terrorist attack on March 22, 2024, although these designations are generally not made public. The content in the first and second cases was removed automatically because Meta’s subject matter experts had assessed another instance of the same video as violating and added it to a Media Matching Service (MMS) Bank. Meta did not apply a strike for these removals. In the third case, the content was removed following human review, and Meta applied a strike that resulted in feature limits. This prevented the user from creating content on the platform, creating or joining Messenger rooms, and advertising or creating live videos.
In all three cases, the users appealed to Meta. Human reviewers found each post violating. After the Board selected these cases, Meta confirmed its decisions to remove all three posts, but removed the strike in the third case.
In their appeals to the Board, all three users emphasized the importance of informing others about real-world events and underlined that the posts did not seek to glorify violence or hatred.
The Board selected these cases to consider how Meta should moderate imagery of terrorist attacks on its platforms. These cases fall within the Board’s strategic priority of Crisis and Conflict Situations.
The Board would appreciate public comments that address:
- Research into harms resulting from the online dissemination of terrorist attack footage, as well as the effectiveness of attempts to limit that dissemination.
- Whether Meta should distinguish video footage of terrorist attacks taken and/or shared by perpetrators from third-party footage of such events, considering its human rights responsibilities.
- The impact of prohibiting terrorist attack imagery in countries with closed media environments, in particular where the government may promote misinformation about the attacks.
- Regulatory incentives for platforms to rapidly remove terrorist content, and the impact of such regulation on speech about terrorist attacks.
As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to these cases.
Public Comments
If you or your organization feel you can contribute valuable perspectives that can help with reaching a decision on the cases announced today, you can submit your contributions using the button below. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23:59 Pacific Standard Time (PST) on Thursday, July 25.
What’s Next
Over the next few weeks, Board Members will be deliberating these cases. Once they have reached their decision, we will post it on the Decisions page.