Oversight Board Announces New Cases Related to Nigeria and India
September 15, 2022
Today, the Board is announcing two new cases for consideration. As part of this, we are inviting people and organizations to submit public comments.
Case Selection
As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse, or raise important questions about Meta’s policies.
The cases we are announcing today are:
Video after Nigeria church attack (2022-011-IG-UA)
User appeal to restore content to Instagram
Submit public comment here.
In June 2022, an Instagram user posted a video which appears to have been filmed shortly after a mass shooting in a church in Nigeria. The video shows motionless bloodied bodies on the floor of a church. The sounds of a chaotic scene, including people wailing and screaming, can be heard in the background. After the user posted the content, one of Meta’s Media Matching Service banks for content that relates to the Violent and Graphic Content policy identified the post. These banks automatically identify images and videos that Meta has previously determined require action. Another automated system then assessed the content and applied a warning screen to the video, marking it as disturbing. The content was also reported by three users, including for depicting death and severe injury.
About a week after posting the content, the user added an English-language caption to the video. It states that gunmen attacked the church, that multiple people were killed, and describes the shooting as sad. It then includes a series of hashtags, primarily about recreational weapons, allusions to the sound of guns firing, and military equipment and simulations. A different Meta Media Matching Service bank for the Violent and Graphic Content policy then identified and removed the post for violating the policy. Meta later explained that it considered the caption to glorify violence and that it included sadistic hashtags. The reports made by users were not reviewed and were closed once the content was removed.
The user appealed, and Meta maintained its decision to remove the content. At the time of removal, the content had been viewed more than 6,000 times. The user then appealed to the Board.
In their statement to the Board, the user says they posted the content to show the world what was happening in Nigeria and to raise awareness of the killing of innocent people. The user also states that they do not support violence.
Under its Violent and Graphic Content policy, Meta states that it removes any content that "glorifies violence or celebrates suffering or humiliation of others" but allows graphic content "to help people raise awareness." The policy prohibits posting "videos of people or dead bodies in non-medical settings if they depict dismemberment, visible internal organs, or partially decomposed bodies." The policy also states that warning screens are applied to “imagery that shows the violent death of a person or people by accident or murder,” and that such content can only be viewed by users aged 18 or over. When posted with sadistic remarks, this imagery is removed. According to its newsworthiness allowance, Meta allows violating content on its platforms "if keeping it visible is in the public interest."
The Board would appreciate public comments that address:
- Whether Meta’s policy on Violent and Graphic Content, including its newsworthiness allowance, strikes the right balance between protecting the rights of survivors and victims (including their families and loved ones) and documenting or raising awareness of human rights abuses or violations.
- Whether and how Meta’s enforcement of the Violent and Graphic Content Community Standard varies across regions, and information about the causes and impacts of any such differences.
- Insights on the socio-political and legal context in Nigeria regarding any challenges or limitations to freedom of expression, specifically about national security and documenting and raising awareness of human rights violations.
- Insights on the role of social media globally as a resource and forum for documenting and raising awareness of human rights violations.
In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.
India sexual harassment video (2022-012-IG-MR)
Case referred by Meta
Submit public comment here.
In March 2022, an Instagram account describing itself as a platform for Dalit perspectives posted a video from India showing a woman being assaulted by a group of men. The woman’s face is not visible. The text accompanying the video states that a “tribal woman” was sexually assaulted and harassed by a group of men in public, and that the video went viral. The account has around 30,000 followers, mostly located in India.
The content was reported by another Instagram user for sexual solicitation and sent for human review. Human reviewers determined that the content violated Meta’s Adult Sexual Exploitation policy. Under this policy, Meta removes content “that depicts, threatens or promotes sexual violence, sexual assault or sexual exploitation.” Following additional human review, Meta issued a newsworthiness allowance, restored the content and placed a warning screen on the video alerting users it may contain violent or graphic content. The warning screen prevents users under the age of 18 from viewing the content and requires all other users to click through the screen to view the video. A newsworthiness allowance permits content on Meta’s platforms that might otherwise violate its policies where the content is newsworthy and keeping it visible is in the public interest. It can only be applied by specialist teams within Meta, and not by human reviewers who review content at scale.
Meta referred this case to the Board, stating that it demonstrates the challenge in striking “the appropriate balance between allowing content that condemns sexual exploitation and the harm in allowing visual depictions of sexual harassment to remain on [its] platforms.” It states that the content was initially removed for violating the Adult Sexual Exploitation policy for depicting non-consensual sexual touching, and that “because of the graphic and harmful nature of this kind of depiction, the policy does not allow this kind of content to be shared in a condemning context.” It says it has only allowed such content “in limited circumstances, upon escalation, and on a case-by-case basis.”
The Board would appreciate public comments that address:
- Whether Meta’s policies and enforcement practices, including its newsworthiness allowance, appropriately balance protecting users from potentially harmful content and allowing users to raise awareness.
- Insights into the socio-political context affecting the treatment of Dalit and Adivasi individuals and communities, in particular women. These insights may address any relevant power dynamics, practices of physical and social segregation and discrimination, and how existing hierarchies may be reproduced digitally.
- The role of social media in raising awareness of and condemning sexual exploitation and other harmful acts against people from marginalized communities in India.
- The harm caused by allowing visual depictions of sexual harassment to remain on Meta’s platforms, even if the victims cannot be recognized or identified.
- The potential for visual depictions of violence against marginalized groups to contribute to an increase in such violence, even when shared in a condemning context.
In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.
Public Comments
If you or your organization feel that you can contribute valuable perspectives that can help the Board reach a decision on the cases announced today, you can submit your contributions using the links above. The public comment window for both cases is open for 14 days, closing at 3pm UTC on Thursday 29 September 2022.
What’s Next
In the coming weeks, Board Members will be deliberating these cases. Once they have reached their final decisions, we will post them on the Oversight Board website. In addition, following the ‘Colombia police cartoon’ and ‘Mention of the Taliban in news reporting’ decisions which we published today, we expect to publish the ‘Tigray Communication Affairs Bureau’ case decision soon.
To receive updates when the Board announces new cases or publishes decisions, sign up here.