Oversight Board Overturns Meta’s Decision in “Video After Nigeria Church Attack” Case
December 14, 2022
The Oversight Board has overturned Meta’s decision to remove a video from Instagram showing the aftermath of a terrorist attack in Nigeria. The Board found that restoring the post with a warning screen protects victims’ privacy while allowing for discussion of events that some states may seek to suppress.
About the Case
On June 5, 2022, an Instagram user in Nigeria posted a video showing motionless, bloodied bodies on the floor. It appears to be the aftermath of a terrorist attack on a church in southwest Nigeria, in which at least 40 people were killed and many more injured. The content was posted on the same day as the attack. Comments on the post included prayers and statements about safety in Nigeria.
Meta’s automated systems reviewed the content and applied a warning screen. However, the user was not alerted, as Instagram users do not receive notifications when warning screens are applied.
The user later added a caption to the video. This described the incident as “sad,” and used multiple hashtags, including references to firearms collectors, allusions to the sound of gunfire, and the live-action game “airsoft” (where teams compete with mock weapons). The user had included similar hashtags on many other posts.
Shortly afterwards, one of Meta’s Media Matching Service banks, an “escalations bank,” identified the video and removed it. Media Matching Service banks can automatically match users’ posts to content that has previously been found violating. Content in an “escalations bank” has been found violating by Meta’s specialist internal teams; any matching content is identified and immediately removed.
The user appealed the decision to Meta and a human reviewer upheld the removal. The user then appealed to the Board.
When the Board accepted the case, Meta reviewed the video in the “escalations bank,” found it to be non-violating, and removed it from the bank. However, Meta upheld its decision to remove the post in this case, saying the hashtags could be read as “glorifying violence and minimizing the suffering of the victims.” Meta found that this violated multiple policies, including the Violent and Graphic Content policy, which prohibits sadistic remarks.
Key Findings
A majority of the Board finds that restoring this content to Instagram is consistent with Meta’s Community Standards, values and human rights responsibilities.
Nigeria is experiencing an ongoing series of terrorist attacks, and the Nigerian government has suppressed coverage of some of them, though it does not appear to have done so in relation to the June 5 attack. The Board agrees that in such contexts freedom of expression is particularly important.
When the hashtags are not considered, the Board is unanimous that a warning screen should be applied to the video. This would protect the privacy of the victims, some of whose faces are visible, while respecting freedom of expression. The Board distinguishes this video from the significantly less graphic image in the “Russian poem” case, where it found a warning screen was not required. It also distinguishes it from the significantly more graphic footage in the “Sudan graphic video” case, where the Board agreed with Meta’s decision to restore the content with a warning screen under a “newsworthiness allowance,” which permits otherwise violating content.
A majority of the Board finds that the balance still weighs in favor of restoring the content when the hashtags are considered, as they raise awareness and are not sadistic. Hashtags are commonly used to promote a post within a community, a practice Meta’s algorithms encourage, so the company should be cautious in attributing ill intent to their use. The majority notes that Meta did not find that these hashtags are used as coded mockery. Users commenting on the post appeared to understand that it was intended to raise awareness, and the responses from the post’s author were sympathetic to the victims.
A minority of the Board finds that adding shooting-related hashtags to the footage appears sadistic and could traumatize survivors or victims’ families. A warning screen would not reduce this effect. Given the context of terrorist violence in Nigeria, Meta is justified in exercising caution, particularly when victims are identifiable. The minority therefore finds this post should not be restored.
The Board finds that the Violent and Graphic Content policy should be clarified. The policy prohibits “sadistic remarks,” yet the definition of that term in the internal guidance for moderators is broader than its common usage.
The Board notes that the content was originally removed because it matched a video that had wrongly been added to the escalations bank. In the immediate aftermath of a crisis, Meta was likely attempting to ensure that violating content did not spread on its platforms. However, the company must now ensure that mistakenly removed content is restored and any resulting strikes are reversed.
The Oversight Board’s Decision
The Oversight Board overturns Meta’s decision to remove the post and finds it should be restored to the platform with a “disturbing content” warning screen.
The Board recommends that Meta:
- Review the language in the public Violent and Graphic Content policy to ensure it aligns with the internal guidance for moderators.
- Notify Instagram users when a warning screen is applied to their content and provide the specific policy rationale for doing so.
* The full decision is currently being translated into Yoruba. The Board will publish the Yoruba translation on its website as soon as possible in 2023.
For Further Information
To read the full decision, click here.
To read a synopsis of public comments for this case, please click here.