Oversight Board Announces Case About a News Documentary on Child Abuse in Pakistan

Today, the Board is announcing a new case for consideration. As part of this, we are inviting people and organizations to submit public comments.

Case Selection

As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse, or raise important questions about Meta’s policies.

The case that we are announcing today is:

News Documentary on Child Abuse in Pakistan

2024-001-FB-MR

Meta Referral

Note: Please be aware before reading that the following case summary includes disturbing material dealing with content about sexual violence against minors.

In January 2022, a news organization posted on its Facebook page a video of a documentary about a man convicted in a Pakistani court of serial murders. The video, in Urdu, contains extensive details about the crimes, which involved the sexual abuse and murder of many children in the 1990s, as well as the man’s subsequent arrest and trial. The video clearly shows identifiable images of the child victims. The caption warns that the video contains interviews with people associated with the perpetrator and his crimes, and details about sexual abuse and violence. The content was viewed about 21.8 million times, received about 51,000 reactions and 5,000 comments, and was shared about 18,000 times.

After it was posted, 67 users reported the content, while Meta’s High Risk Early Review Operations (HERO) system reported it eight times due to its high virality signals. Meta’s HERO system is designed to identify potentially violating content that is predicted to have a high likelihood of going viral. Once identified by the system, the content is prioritized for human review by Meta’s staff with language, market and policy expertise.

In this case, initial automated reviews and a human reviewer concluded the content did not violate any Community Standards. Later, however, following additional review, Meta decided its original decision to keep the content up was wrong. The company then removed the post for violating the Child Sexual Exploitation, Abuse and Nudity policy’s prohibition on sharing “content that identifies, or mocks alleged victims of sexual exploitation by name or image.” Meta did not apply a strike against the account of the news organization that posted the content because of the public interest and awareness-raising context of the video, and the notable length of time between the content being posted and removed.

Meta referred this case to the Board, noting that it represents tensions in Meta’s values of voice, safety, privacy and dignity when content involves the sharing of imagery of child abuse victims in a documentary. Meta considers this case significant and difficult because the company has to “weigh the safety, privacy and dignity of the child victims against the fact that the footage does not emphasize the child victims’ identities, the events depicted are from over 30 years ago, and the video appears designed to raise awareness around a serial killer’s crimes and discuss issues that have high public interest value.”

The Board selected this case to assess the impact of Meta’s Child Sexual Exploitation, Abuse and Nudity Community Standard on the rights of child victims, especially in the context of reporting and when significant time has passed since the events. It falls within the Board’s strategic priority of Treating Users Fairly.

The Board would appreciate public comments that address:

  • Media freedoms in Pakistan, in particular any legal restrictions on the press or social media in reporting on crimes against children.
  • Whether Meta’s Child Sexual Exploitation, Abuse and Nudity policy adequately protects the rights of identifiable child victims of sexual crimes and their families, as well as the rights of freedom of expression among people reporting on or raising awareness of such crimes.
  • Ethical journalism standards on reporting of sexual crimes against child victims, including historic crimes, and the inclusion of extensive details and/or victims’ names or faces in reporting that could lead to their identification.
  • Trade-offs associated with automated systems designed to detect and prioritize enforcement decisions on potentially viral content.

As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

Public Comments

If you or your organization feel you can contribute valuable perspectives that can help the Board reach a decision on the case announced today, you can submit your contributions using the link above. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23:59 your local time on Tuesday 12 March.

What’s Next

Over the next few weeks, Board Members will be deliberating this case. Once they have reached their final decision, we will post it on the Oversight Board website.

To receive updates when the Board announces new cases or publishes decisions, sign up here.
