Case description
These three cases concern decisions made by Meta to remove content from Facebook, which the Oversight Board intends to address together.
Three posts were shared by different users on Facebook immediately after the March 22, 2024, terrorist attack at a concert venue and shopping center in Moscow. Each contains imagery of the attack. Meta decided to remove all three posts.
In the first case, a Facebook user posted a short video clip accompanied by a caption in English. The video shows part of the attack from inside the shopping center, with the footage seemingly taken by a bystander. Armed people are shown shooting unarmed people at close range, with some victims crouching on the ground and others fleeing the building. The footage is not high resolution; the attackers and people being shot are visible but not easily identifiable, whereas others leaving the building are identifiable. In the audio, gunfire can be heard, with people screaming. The caption asked what is happening in the country and included prayers for those impacted. When Meta removed the post, it had fewer than 50 views.
In the second case, a different Facebook user posted a shorter clip of the same footage, also accompanied by a short caption in English. The caption warned viewers about the video’s content, stating there is no place in the world for terrorism, with the addition of a #terrorist hashtag. When Meta removed the post, it had fewer than 50 views.
The third case involves a post shared on a group page by one of its administrators. The group’s description expresses support for former French Presidential candidate Éric Zemmour. The post included a still image from the attack, which could have been taken from the same video, showing armed gunmen and victims. Additionally, there is a short video filmed by someone driving past the shopping center, which is on fire. The French caption included the word “Alert” alongside commentary on the attack, such as the reported number of fatalities. The caption also stated that Ukraine had said it had nothing to do with the attack, while pointing out that nobody had claimed responsibility for it. The caption concluded with a comparison to the Bataclan terrorist attack in Paris and a statement of support for the Russian people. When Meta removed the post, it had about 6,000 views.
The company removed all three posts under its Dangerous Organizations and Individuals Community Standard, which prohibits sharing third-party imagery depicting the moment of terrorist attacks on visible victims. Meta told the Board that “quickly removing [a] moment of attack imagery on visible victims promotes safety by helping to mitigate the risk of contagion and copycat attacks while disrupting the spread of perpetrator propaganda.” The company noted that such content removal also protects the dignity of victims and their families who may not want the footage published. However, Meta recognizes that removals may lead to the risk of over-enforcement on content that neutrally discusses, condemns or raises awareness of terrorist attacks. The company added that while it may keep otherwise violating content on its platforms under the newsworthiness allowance in limited cases, this allowance is rarely applied to video footage of violent events.
Meta designated the Moscow attack as a terrorist attack on March 22, 2024, although these designations are generally not made public. The content in the first and second cases was removed automatically because Meta’s subject matter experts had assessed another instance of the same video as violating and added it to a Media Matching Service (MMS) Bank. Meta did not apply a strike for these removals. In the third case, the content was removed following human review, and Meta applied a strike that resulted in feature limits. These prevented the user from creating content on the platform, creating or joining Messenger rooms, and advertising or creating live videos.
In all three cases, the users appealed to Meta. Human reviewers found each post violating. After the Board selected these cases, Meta confirmed its decisions to remove all three posts, but removed the strike in the third case.
In their appeals to the Board, all three users emphasized the importance of informing others about real-world events and underlined that the posts did not seek to glorify violence or hatred.
The Board selected these cases to consider how Meta should moderate imagery of terrorist attacks on its platforms. These cases fall within the Board’s strategic priority of Crisis and Conflict Situations.
The Board would appreciate public comments that address:
- Research into harms resulting from the online dissemination of terrorist attack footage, as well as the effectiveness of attempts to limit that dissemination.
- Whether Meta should distinguish video footage of terrorist attacks taken and/or shared by perpetrators from third-party footage of such events, considering its human rights responsibilities.
- The impact of prohibiting terrorist attack imagery in countries with closed media environments, in particular where the government may promote misinformation about the attacks.
- Regulatory incentives for platforms to rapidly remove terrorist content, and the impact of such regulation on speech about terrorist attacks.
As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to these cases.
Comments
Meta does not work to realize freedom of expression, but to create confusion in countries by supporting ideas that oppose their generally accepted views. If it had been consistent in its principles, it could have explained the two different decisions regarding the same photograph.
Because you are a liar. Your so-called Community Standards lists are also lies.
You censor those who share the generally accepted thoughts of society.
Both lying and censorship are the work of fascists. You are the fascist grandchildren of fascist victims.
Despite all your education, you are primitive people whose world of thought has never changed.
Kind regards...
I support the terrorist attack in Moscow.
Generally, terrorist attacks are in most cases accompanied by very sensitive images, as well as videos that could be unpleasant for users and viewers. Hence, our suggestions are:
* In reporting attacks of such magnitude on Meta, users should be made to go through a compliance process that ensures the report is real, and should consent to liability if it is found to be fake.
* Users should always insert a redirect link on such posts. The reason can be found in the third suggestion.
* Instead of being taken down, such videos/images should be blurred, with a redirect link attached to the post so that other Meta users who are interested in watching the content can access it. This is available on Instagram and should also be incorporated on Facebook.
* A periodic, user-friendly survey could also be adopted to learn what is useful to each user and what is not, then leverage the algorithm to ensure that every user is presented with posts relevant to them and their interests. It could also serve as a research/analytics tool for understanding users’ interests and which information is most consumed.
It is reprehensible to publish suffering.
Those who enjoy images of tragedy have a name: sadists.
Allowing photographs of these horrific events to be disseminated around the world is immoral, and it speaks to media outlets’ need to continually escalate the graphic and horrific depiction of human suffering.