Overturned

News Documentary on Child Abuse in Pakistan

The Oversight Board has overturned Meta’s decision to take down a documentary video, posted by Voice of America (VOA) Urdu, that reveals the identities of child victims of sexual abuse and murder in Pakistan in the 1990s. The majority find that a newsworthiness allowance should have been applied.

Type of Decision

Standard

Policies and Topics

Topic
Children / Children's rights, Journalism, Safety
Community Standard
Child Sexual Exploitation, Abuse and Nudity

Region/Countries

Location
Pakistan

Platform

Platform
Facebook

To read this decision in Urdu, click here.

Summary

The Oversight Board has overturned Meta’s decision to take down a documentary video, posted by Voice of America (VOA) Urdu, that reveals the identities of child victims of sexual abuse and murder in Pakistan in the 1990s. Although the Board finds the post did violate the Child Sexual Exploitation, Abuse and Nudity Community Standard, the majority find that a newsworthiness allowance should have been applied in this case. These Board Members believe the ongoing public interest in reporting on child abuse outweighs the potential harms to the victims from being identified, none of whom survived the crimes, which took place about 25 years ago. Broadly factual in nature and sensitive to the victims, VOA Urdu’s documentary could have informed public debate on the widespread issue of child sexual abuse, which is underreported in Pakistan. This case also highlights how Meta could better communicate to users which policies do, and which do not, benefit from exceptions.

About the Case

In January 2022, the broadcaster Voice of America (VOA) Urdu posted on its Facebook page an 11-minute documentary about Javed Iqbal, who sexually abused and murdered approximately 100 children in Pakistan in the 1990s. The documentary, in Urdu, includes disturbing details of the crimes and the perpetrator’s trial. Images of newspaper clippings clearly show the faces of the child victims along with their names, while other footage shows people in tears who could be relatives. The post’s caption mentions that a different film about the crimes had recently been in the news, and it also warns viewers about the documentary’s contents. The post was viewed about 21.8 million times and shared about 18,000 times.

Between January 2022 and July 2023, 67 users reported the post. Following both automated and human reviews, Meta concluded the content was not violating. The post was also flagged separately by Meta’s High Risk Early Review Operations system because of its high likelihood of going viral. This led to human review by Meta’s internal staff with language, market and policy expertise (rather than by outsourced human moderation). Following escalation internally, Meta’s policy team overturned the original decision to keep the post up and removed it for violating the Child Sexual Exploitation, Abuse and Nudity policy. The company decided not to grant a newsworthiness allowance. Meta then referred this case to the Board.

Key Findings

The majority of the Board find that Meta should have applied the newsworthiness allowance to this content, keeping the post on Facebook. The Board finds the post violated the Child Sexual Exploitation, Abuse and Nudity Community Standard because the child abuse victims are identifiable by their faces and names. However, for the majority, the public interest in reporting on these child abuse crimes outweighed the possible harms to the victims and their families. In coming to their decision, the majority noted that the documentary was produced to raise awareness and does not sensationalize the gruesome details, and, significantly, that the crimes took place about 25 years ago and none of the victims survived. This passage of time is the most important factor because it means the possible direct harms to the child victims have diminished, while the public interest in reporting on child abuse remains.

Experts consulted by the Board confirmed that child sexual abuse is prevalent in Pakistan but underreported. The majority took note of expert reports on Pakistan’s track record of cracking down on independent media and silencing dissent, while also failing to prevent or punish serious crimes against children. This makes social media platforms necessary for reporting on and receiving information about this issue. In this case, the VOA Urdu documentary made an important contribution to public discussions.

A minority note that while the video raised issues of public interest, it was possible for those issues to be discussed in detail without showing the names and faces of the victims, and therefore the content should have been removed.

The Board expresses alarm at the length of time (18 months) it took for Meta to reach a final decision on this content, by which point it had been viewed about 21.8 million times, and questions whether Meta’s resources for reviewing Urdu-language videos are sufficient.

While the rarely used newsworthiness allowance – a general exception that can be applied only by Meta’s expert teams – was relevant here, the Board notes that no specific policy exceptions, such as those for raising awareness or news reporting, are available under the Child Sexual Exploitation, Abuse and Nudity policy. Meta should provide more clarity to users about this.

Additionally, the public-facing language of this policy could make clearer what qualifies as identifying alleged victims “by name or image.” Had VOA Urdu received a more detailed explanation of the rule it was violating, it could have reposted the documentary without the offending images or, for example, with the victims’ faces blurred, if that is permitted.

The Oversight Board’s Decision

The Oversight Board overturns Meta’s decision to take down the content and requires the post to be restored.

The Board recommends that Meta:

  • Create a new section within each Community Standard describing what exceptions and allowances apply. When Meta has a specific rationale for not allowing certain exceptions that apply to other policies (such as news reporting or awareness raising), Meta should include that rationale in this new section.

*Case summaries provide an overview of cases and do not have precedential value.

Full Case Decision

1. Decision Summary

The Oversight Board overturns Meta’s decision to take down a Facebook post from Voice of America Urdu’s page containing a documentary video that reveals the identities of child victims of sexual abuse and murder in Pakistan in the 1990s.

The Board finds that the post violated the text of the Child Sexual Exploitation, Abuse and Nudity Community Standard, as it “identified victims of child sexual abuse by name and image.” However, the majority of the Board find that Meta should have applied the newsworthiness allowance in this case because the current public interest in Pakistan in reporting on child abuse outweighs the potential harms from identifying victims of crimes committed so long ago. A minority of the Board believe it was possible for those issues to be discussed without showing the names and faces of the victims, and hence Meta’s decision to remove the post was warranted.

To better inform users when policy exceptions for awareness raising, news reporting or other justifications could be granted, Meta should create a new section within each Community Standard describing what policy exceptions and allowances apply and provide the rationale when such exceptions or allowances do not apply. This section should note that general allowances such as newsworthiness apply to all Community Standards.

2. Case Description and Background

On January 28, 2022, the broadcaster Voice of America Urdu, funded by the United States government, posted on its Facebook page an 11-minute documentary video about Javed Iqbal, who was convicted in a Pakistani court of committing serial crimes against children. The documentary contained extensive details, in Urdu, about the crimes, which involved the sexual abuse and murder of approximately 100 children in the 1990s. It also covered the perpetrator’s subsequent arrest and trial.

The video contained images of newspaper clippings from 1999 showing the faces of the child victims along with their names and the cities they came from. It also showed children’s photographs discovered during a search of the perpetrator’s house. The documentary depicts extensive details of the events and of incriminating evidence found at the scene of the crimes, including vats of acid in which bodies were reportedly dissolved. There is also footage of people in tears who could be relatives of the child victims.

The documentary mentioned that Javed Iqbal had confessed to bringing children to his home, where he sexually abused them, strangled them to death and disposed of their bodies in acid. It described the arrest of Iqbal and his young accomplice, their subsequent trials and death sentences, and finally Iqbal’s suicide while in custody.

The post’s caption, in Urdu, mentioned that a different film about the crimes had recently been in the news. The caption also described the severity of the crimes, warning that the documentary contained details about sexual abuse and violence, including interviews with people associated with the perpetrator and his crimes.

Voice of America Urdu’s Facebook page has about 5 million followers. The content was viewed about 21.8 million times, received about 51,000 reactions and 5,000 comments, and was shared around 18,000 times. Between January 2022 and July 2023, a total of 67 users reported the content. Following both automated and outsourced human reviews during that period, Meta concluded the content was not violating.

Meta’s High Risk Early Review Operations (HERO) system also flagged the content eight times between January 2022 and July 15, 2023, due to its high virality signals. The HERO system is designed to identify potentially violating content predicted to have a high likelihood of going viral. Once identified by the system, the content is prioritized for human review by Meta’s internal staff with language, market and policy expertise (as opposed to review by outsourced moderators).
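Meta has not published HERO’s internals. Purely as an illustrative sketch of the general pattern described above – scoring content on virality signals and routing high-scoring items to internal expert review rather than the standard review path – the following Python fragment may be useful; every signal name, weight and threshold in it is a hypothetical assumption, not a disclosed detail of Meta’s system.

    from dataclasses import dataclass

    @dataclass
    class ContentSignals:
        # Hypothetical engagement-velocity signals; Meta has not disclosed
        # which signals HERO actually uses.
        views_last_hour: int
        shares_last_hour: int
        reshare_depth: int  # how far reshare chains have cascaded

    def predicted_virality(s: ContentSignals) -> float:
        """Toy virality score with assumed weights."""
        return (0.5 * s.views_last_hour
                + 3.0 * s.shares_last_hour
                + 10.0 * s.reshare_depth)

    ESCALATION_THRESHOLD = 100_000  # illustrative cutoff, not a real Meta value

    def route_for_review(s: ContentSignals) -> str:
        """Send likely-viral content to internal staff with language, market
        and policy expertise; everything else follows the ordinary queue."""
        if predicted_virality(s) >= ESCALATION_THRESHOLD:
            return "internal_expert_review"
        return "standard_review_queue"

The point of the pattern, as the decision describes it, is that prioritization happens before peak virality, so that content likely to reach millions of views is assessed by reviewers with the context to apply escalation-only allowances such as newsworthiness.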

In late July 2023, following a report from the HERO system, the internal regional operations team within Meta escalated the content to Meta’s policy experts, requesting an assessment under the newsworthiness allowance. Following this review in August 2023, the policy team overturned the original decision to keep the content up and removed it for violating the Child Sexual Exploitation, Abuse and Nudity policy. Meta did not grant a newsworthiness allowance for this content because it concluded that the potential risk of harm outweighed the public interest value. The company did not specify the nature and extent of this risk.

Meta did not apply a strike against the account of the news organization that had posted the content because of the public interest and awareness-raising context of the video, as well as the notable length of time (18 months) between the content being posted and its removal.

Meta referred this case to the Board because it considered it significant and difficult as the company has to “weigh the safety, privacy and dignity of the child victims against the fact that the footage does not emphasize the child victims’ identities, the events depicted are from over 30 years ago, and the video appears designed to raise awareness around a serial killer’s crimes and discuss issues that have high public interest value.”

The Board notes the following context in reaching its decision in this case. Civic space and media freedom in Pakistan are considerably restricted. UN human rights experts and civil society organizations have highlighted that the Pakistani state has a history of curtailing media freedoms and targeting those who speak critically of the authorities with arrest and legal action. Media outlets have faced interference, withdrawal of government advertising, and bans on television presenters and on broadcasting content. Likewise, online activists, dissidents and journalists are often subjected to state-sponsored threats and harassment. Independent media outlets have also documented how the Pakistani authorities make requests for social media companies to remove content. Meta reported in the company’s Transparency Center that between June 2022 and June 2023, the company geo-blocked 7,665 posts that Pakistan’s authorities reported to Meta. Local access to these posts was restricted for allegedly violating local laws, even though they did not necessarily violate Meta’s policies.

Despite written confessions reportedly mailed to the local police, the crimes Javed Iqbal committed were not seriously investigated by the authorities until Pakistani journalists, who had received and investigated the confession letter, published a story in Jang newspaper on December 3, 1999, with the names and photos of 57 alleged child victims, thus alerting their families and generating a public uproar. Extensive coverage of the crimes, Javed Iqbal’s confession and his subsequent arrest, conviction and suicide followed, in Pakistan and internationally.

Between January 2022 and January 2024, films, documentaries and media reports re-ignited interest and fueled discussions about Javed Iqbal and his crimes. “Javed Iqbal: The Untold Story of a Serial Killer,” a film that was set to be released in January 2022, was banned for several months by Pakistan’s Central Bureau of Film Censors because, according to news reports, the title glorified Iqbal. The film was released later that year at the UK Asian Film Festival. Experts the Board consulted and independent media reported that the producers edited the film and changed its name to “Kukri” (based on Javed Iqbal’s nickname) ahead of its resubmission to the Pakistani Censor Board. The film was authorized and re-released in Pakistan in June 2023.

Child sexual abuse in Pakistan remains prevalent. According to experts the Board consulted, from 2020 to 2022 there were some 5.4 million reports of online child exploitation in Pakistan on social media, based on data gathered by the National Center for Missing and Exploited Children (NCMEC). NCMEC collects reports of child sexual abuse material on U.S.-based social media platforms, and 90% of these reports concerned content posted on Meta’s platforms. Sahil, an Islamabad-based NGO, reports that an average of 12 children per day were subjected to sexual abuse in Pakistan during the first half of 2023. Almost 75% of the more than 2,200 cases reported in 2023 came from Punjab, Pakistan’s most populous province. Two other heinous crimes, reported in the city of Kasur, involved the sexual abuse of 280 children by a gang, and the murder and sexual abuse of a six-year-old child, with the media publishing photographs that included her dead body.

3. Oversight Board Authority and Scope

The Board has authority to review decisions that Meta submits for review (Charter Article 2, Section 1; Bylaws Article 2, Section 2.1.1).

The Board may uphold or overturn Meta’s decision (Charter Article 3, Section 5), and this decision is binding on the company (Charter Article 4). Meta must also assess the feasibility of applying its decision in respect of identical content with parallel context (Charter Article 4). The Board’s decisions may include non-binding recommendations that Meta must respond to (Charter Article 3, Section 4; Article 4). When Meta commits to act on recommendations, the Board monitors their implementation.

4. Sources of Authority and Guidance

The following standards and precedents informed the Board’s analysis in this case:

I. Oversight Board Decisions

The prior Board decisions cited in this decision include:

  • Swedish Journalist Reporting Sexual Violence Against Minors
  • Armenian Prisoners of War Video
  • Mention of the Taliban in News Reporting
  • Armenians in Azerbaijan
  • Breast Cancer Symptoms and Nudity
  • Sharing Private Residential Information (policy advisory opinion)

II. Meta’s Content Policies

The policy rationale for the Child Sexual Exploitation, Abuse and Nudity policy states that Meta does not permit content that “sexually exploits or endangers children.” Under this policy, Meta removes “content that identifies or mocks alleged victims of sexual exploitation by name or image.”

The Board’s analysis was informed by Meta’s commitment to voice, which the company describes as “paramount,” and its values of safety, privacy and dignity.

Newsworthiness Allowance

Meta defines the newsworthiness allowance as a general policy allowance that can be applied across all policy areas within the Community Standards, including the Child Sexual Exploitation, Abuse and Nudity policy. It allows otherwise violating content to be kept on the platform if the public interest value in doing so outweighs the risk of harm. According to Meta, such assessments are made only in “rare cases,” following escalation to the Content Policy team. This team assesses whether the content in question poses an imminent threat to public health or safety or gives voice to perspectives currently being debated as part of a political process. This assessment considers country-specific circumstances, including whether elections are underway. While the speaker’s identity is a relevant consideration, the allowance is not limited to content posted by news outlets.

Meta reported that from June 1, 2022, to June 1, 2023, only 69 newsworthiness allowances were documented globally. Similar numbers were reported for the previous year.

III. Meta’s Human Rights Responsibilities

The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. In 2021, Meta announced its Corporate Human Rights Policy, in which it reaffirmed its commitment to respecting human rights in accordance with the UNGPs. The following international standards may be relevant to the Board’s analysis of Meta’s human rights responsibilities in this case:

  • The right to freedom of expression: Article 19, International Covenant on Civil and Political Rights (ICCPR); General Comment No. 34, Human Rights Committee (2011); reports of the UN Special Rapporteur on freedom of opinion and expression, A/74/486 (2019); A/HRC/38/35 (2018); A/69/335 (2014) and A/HRC/17/27 (2011).
  • The best interests of the child: Article 3, Convention on the Rights of the Child (UNCRC); General Comment No. 25, Committee on the Rights of the Child (2021). Moreover, Article 17, UNCRC, recognizes the important function performed by the mass media in children’s access to information aimed at the promotion of their physical and mental health.
  • The right to privacy: Article 17, ICCPR; Article 16, UNCRC.

5. User Submissions

The author of the post was notified of the Board’s review and provided with an opportunity to submit a statement. No response was received.

6. Meta’s Submissions

According to Meta, the post violated the Child Sexual Exploitation, Abuse and Nudity Community Standard as it showed identifiable faces of child victims of sexual exploitation, together with their names. Meta defines an individual as identified through name or image if “the content includes any of the following information: (i) mention of the individual’s name (first, middle, last or full name) unless the content explicitly states that the name has been made up [or] (ii) imagery depicting the individual’s face.”

Meta distinguishes content identifying adult victims of sexual abuse from content identifying child victims because children have “reduced capacity” to grant informed consent to identification. Given this, the risks of revictimization, community discrimination and further violence remain significant for children. Meta therefore provides no policy exceptions under the Child Sexual Exploitation, Abuse and Nudity policy for content identifying alleged victims of sexual exploitation by name or image, even when shared for the purposes of raising awareness of, reporting on or condemning the abuse.

Child rights advocates emphasized to Meta that its policies should prioritize child safety, especially in cases involving child victims of sexual assault. Other external stakeholders noted to Meta that the goal of avoiding victimization of minors has to outweigh potential newsworthiness in identifying child victims.

The Board asked Meta to explain its decision not to grant the content a newsworthiness allowance in this case. Meta noted that though the content had public interest value, the risk of harm from identification of the victims remained significant. Although the crimes occurred in the 1990s, the victims identified were children, and the abuses they suffered were violent and sexual in nature.

In this case, Meta did not apply a strike against the account of the news organization that posted the content because of the public interest and awareness-raising context of the video, and the notable length of time between the content being posted and its removal.

In response to the Board’s questions, Meta noted that it uses the HERO system to proactively flag content before it reaches peak virality, drawing on a number of different signals. The system prioritizes for review and potential action content that is likely to go viral, and it is one of many tools used to address problematic viral content on the platform.

The Board asked Meta 15 questions in writing, relating to Meta’s policy choices around the Child Sexual Exploitation, Abuse and Nudity policy, Meta’s strike system and the HERO system. Meta answered all 15 questions.

7. Public Comments

The Oversight Board received four public comments that met the terms for submission. Two were submitted from the United States and Canada, one from Europe and one from Asia Pacific and Oceania. To read the public comments submitted with consent to publish, please click here.

The submissions covered the following themes: the importance of protecting the privacy and identity of victims of child abuse as well as the privacy of families; the interplay between the UN Convention on the Rights of the Child and Meta’s Child Sexual Exploitation, Abuse and Nudity policy; the educational and awareness-raising context of the documentary; and the role of journalists in reporting about child abuse crimes.

8. Oversight Board Analysis

The Board accepted this Meta referral to assess the impact of Meta’s Child Sexual Exploitation, Abuse and Nudity Community Standard on the rights of child victims, especially in the context of reporting on crimes after a notable passage of time. This case concerns the protection of civic space, which is among the Board’s strategic priorities. The Board examined whether this content should be restored by analyzing Meta’s content policies, human rights responsibilities and values.

8.1 Compliance with Meta’s Content Policies

The Board agrees with Meta that the content in this case violated the explicit rules of the Child Sexual Exploitation, Abuse and Nudity Community Standard, as the video showed identifiable faces and contained the names of child abuse victims.

The majority of the Board, however, find that Meta should have applied the newsworthiness allowance on escalation and permitted the content to remain on Facebook. For the majority, the public interest in reporting on child abuse crimes with the characteristics present in this case outweighed the possible harm to the victims and their families. This conclusion rests largely on the facts that the documentary was produced to raise awareness, that it neither mocks nor sensationalizes the gruesome details it reports on, and, most significantly, that the crimes took place about 25 years ago and none of the victims survived.

For the majority, the passage of a significant period of time was the most important factor in this case. With the passage of time, the potential impact on the rights of children and their families may subside, while the public interest in reporting on and addressing child abuse in Pakistan persists. In this case, the crimes against these children took place about 25 years ago, and all the identifiable child victims depicted in the documentary are deceased.

Child abuse has remained widespread in Pakistan (see section 2) and is the subject of significant public discourse. The majority of the Board take note of the expert reports stating that Pakistan has a track record of cracking down on independent media and silencing dissent, while also failing to prevent or punish serious crimes against children. Therefore, social media platforms are necessary for all people, including news media, to report on and receive information relating to child abuse in Pakistan. This documentary was broadly accurate and factual in nature, and sensitive to the victims. It was specifically contextualized against recent government decisions to censor a film on the topic, and therefore made an important contribution to public discussions.

A minority of the Board consider that Meta was right not to apply the newsworthiness allowance in this case, highlighting that the protection of the dignity and rights of the child victims and their families was paramount and should not be diminished by the passage of time or the other considerations relied on by the majority. The minority note that while the video raised issues of public interest, it was possible for those issues to be discussed in detail without showing the names and faces of the victims. Consequently, removing the post was in line with Meta’s values of privacy and dignity.

The Board notes that when conducting a newsworthiness assessment, it is imperative that Meta consider the potential adverse human rights impacts of a decision to leave up or remove a post. These considerations are outlined in the next section.

8.2 Compliance with Meta’s Human Rights Responsibilities

The majority of the Board find that removing this post was not necessary or proportionate, and restoring the post to Facebook is consistent with Meta’s human rights responsibilities.

Freedom of Expression (Article 19 ICCPR)

Article 19, para. 2 of the ICCPR provides for broad protection of political discourse and journalism (General Comment No. 34 (2011), para. 11). The UN Special Rapporteur on freedom of expression has stated that states can encourage media organizations to self-regulate the way in which they cover and involve children. Citing draft guidelines and principles from the International Federation of Journalists, the Special Rapporteur noted that these included “provisions on avoiding the use of stereotypes and the sensational presentation of stories involving children,” (A/69/335, para. 63).

When restrictions on expression are imposed by a state, they must meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR). These requirements are often referred to as the “three-part test.” The Board uses this framework to interpret Meta’s voluntary human rights commitments, both in relation to the individual content decision under review and what this says about Meta’s broader approach to content governance. As the UN Special Rapporteur on freedom of expression has stated, although “companies do not have the obligations of Governments, their impact is of a sort that requires them to assess the same kind of questions about protecting their users’ right to freedom of expression,” (A/74/486, para. 41).

I. Legality (Clarity and Accessibility of the Rules)

The principle of legality under international human rights law requires rules that limit expression to be clear and publicly accessible (General Comment No. 34, para. 25). Restrictions on expression should be formulated with sufficient precision to enable individuals to regulate their conduct accordingly (ibid.). As applied to Meta, the company should provide guidance to users about what content is permitted on the platform and what is not. Additionally, rules restricting expression “may not confer unfettered discretion on those charged with [their] execution” and must “provide sufficient guidance to those charged with their execution to enable them to ascertain what sorts of expression are properly restricted and what sorts are not,” (A/HRC/38/35, para. 46).

The Board finds that the Child Sexual Exploitation, Abuse and Nudity policy, as applied to this case, is sufficiently clear to satisfy the legality requirement but that improvements could be made.

Journalists, like all users, should be provided with sufficient guidance on how to discuss challenging topics on social media platforms within the rules. It could be made clearer that sharing images in which the face or name of a child victim is visible is not permitted in discussion of issues around child abuse. More detailed definitions of identification through name or image are included in the internal guidelines available only to Meta’s content reviewers. The Board urges Meta to explore providing more clarity about what precisely qualifies as identifying alleged victims “by name or image,” including whether “by name” covers partial names, and whether “image” means only an unobscured face, such that blurring the faces of victims would bring content into compliance.

The Board notes that Meta considered policy changes in this area but decided not to include an “awareness raising” exception under the Child Sexual Exploitation, Abuse and Nudity policy, on the basis that this position was in line with the best interests of the child, stipulated in Article 3 of the UNCRC. The company pointed to the risk of revictimization and to child victims’ reduced capacity to grant informed consent to being featured or referenced in reports about child abuse. In the interest of transparency and clear guidance to users, the Child Sexual Exploitation, Abuse and Nudity policy should state explicitly that it does not permit the identification of child victims of sexual abuse, even where the intention is to report on, raise awareness of or condemn that abuse. Since many other policies do include policy exceptions, Meta should not presume that silence on whether exceptions apply gives users sufficient notice that media reporting and advocacy may be removed unless it meets certain conditions of respect for dignity and privacy. Such notice could be framed similarly to existing guidance in the policy rationale, which outlines why Meta has a blanket prohibition against sharing, for example, nude images of children, even when the intent of those children’s parents is innocuous.

Such an update should indicate that Meta could grant a newsworthiness allowance in highly exceptional circumstances. The Board notes that Meta’s explanation of that allowance includes, as an example, its decision to permit, for reasons of public interest and historical significance, the “Terror of War” photograph of Phan Thị Kim Phúc, sometimes referred to informally as the “napalm girl” photograph.

The Board notes that policy exceptions and general allowances, namely the newsworthiness allowance and the spirit of the policy allowance, are distinct but not easily distinguishable. While each Community Standard may or may not provide for certain policy exceptions, general allowances can be applied across all policy areas within the Community Standards. Therefore, to provide clear and accessible guidance to users, Meta should create a new section within each Community Standard describing what policy exceptions and general allowances apply. When Meta has a specific rationale for not providing certain exceptions that apply to other policies (such as awareness raising), Meta should include that rationale in this new section. The section should note that general allowances apply to all Community Standards.

II. Legitimate Aim

Restrictions on freedom of expression must pursue a legitimate aim, which includes the protection of the rights of others and the protection of public order and national security.

In the Swedish Journalist Reporting Sexual Violence Against Minors decision, the Board concluded that the Child Sexual Exploitation, Abuse and Nudity policy aims to prevent offline harm to the rights of minors. The Board finds that Meta’s decision in this case and the policy underlying the original removal pursues the legitimate aim of protecting the rights of child victims of sexual abuse to physical and mental health (Article 17 UNCRC), and their right to privacy (Article 17 ICCPR, Article 16 UNCRC), consistent with respecting the best interests of the child (Article 3 UNCRC).

III. Necessity and Proportionality

The principle of necessity and proportionality provides that any restrictions on freedom of expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; [and] they must be proportionate to the interest to be protected,” (General Comment No. 34, paras. 33-34).

Article 3 of the UNCRC states that “in all actions concerning children, ... the best interests of the child shall be a primary consideration.” Consistent with this, UNICEF’s Guidelines for Journalists Reporting on Children note that the rights and dignity of every child should be respected in every circumstance and that the best interests of the child should be protected over any other consideration, including advocacy for children’s issues and the promotion of child rights.

The Committee on the Rights of the Child has noted that states should have regard for all children’s rights, including “to be protected from harm and to have their views given due weight,” (General Comment No. 25, para. 13). The Committee further highlighted that “privacy is vital to children’s agency, dignity and safety and for the exercise of their rights” and that “threats may arise from… a stranger sharing information about a child” online (General Comment No. 25, para. 67).

The Board underlines that Meta’s prohibition of content identifying victims of child sexual exploitation by name or image is a necessary and proportionate policy. The circumstances to depart from this rule will be exceptional and require a detailed assessment of context by subject matter experts (for analogous or additional standards regarding the consideration of exceptional circumstances when determining whether to allow the identification of persons in vulnerable situations, see Armenian Prisoners of War Video).

For the majority of the Board, Meta should have kept this content on the platform under its newsworthiness allowance. The majority highlight that leaving the content up under the newsworthiness allowance was consistent with the best interests of the children in this case, which Meta rightly identifies as a concern that should be given utmost importance.

For the majority, three key factors in combination provide the basis for a newsworthiness allowance. First, and most important, the passage of time, together with the fact that all the child victims concerned are deceased, diminishes the possible direct harm to them. Second, the sexual abuse of children remains a widespread but underreported phenomenon in Pakistan. Third, the documentary in question does not sensationalize the issue but raises awareness in an almost educational way, and could help inform public debate on a significant human rights concern that has long beset Pakistan and other nations.

The majority of the Board note that while the images and names of the victims shown in old newspaper clippings and photographs could have been blurred, removing the whole documentary against the backdrop of the factors above is disproportionate. Instead, Meta could explore alternative measures to inform users about the relevant policy and provide technical solutions to prevent violations, as discussed below. Given the specific combination of factors outlined above, the documentary should have been granted a newsworthiness allowance.

For a minority of the Board, Meta’s decision to remove this content and not apply the newsworthiness allowance was in line with Meta’s human rights responsibilities and consistent with the best interests of the child in this case. Such reporting, the minority believe, should prioritize the dignity of child victims of abuse and ensure their privacy rights are respected, regardless of the passage of time and the assumed public-debate value of such content.

These Board Members highlight that when reporting on child abuse, journalists and media organizations have an ethical responsibility to follow professional codes of conduct. Given that engagement-based social media can incentivize sensationalism and “clickbait,” it is an appropriate mitigation for Meta to adopt strict content policies requiring media to report responsibly on sensitive matters impacting children. This would be consistent with applicable human rights standards that encourage “evidence-based reporting that does not reveal the identity of children who are victims and survivors,” (General Comment No. 25, para. 57) and that “encourage the media to provide appropriate information regarding all aspects of the ... sexual exploitation and sexual abuse of children, using appropriate terminology, while safeguarding the privacy and identity of child victims and child witnesses at all times,” (Guidelines of the Optional Protocol to the Convention on the Rights of the Child, para. 28.f).

While the content in this case does concern a matter of public interest, the minority believe that Meta requiring stricter adherence to the standards of journalistic ethics would allow these issues to be reported in a way that respects the dignity and privacy rights of the victims and their families. A minority of the Board also underline that Meta’s decision not to apply a strike to the news organization’s account, having properly removed the content, was proportionate.

Although the Board overturns Meta’s decision to remove this post, it remains alarmed that the company took 18 months to reach a decision on a piece of content it ultimately deemed violating, despite dozens of user reports and flags from the company’s own virality prediction system. Meta should investigate the reasons for this and assess whether its systems or resources for reviewing Urdu-language videos are sufficient (see Mention of the Taliban in News Reporting). Effective systems are essential to ensure that such posts, where necessary, are referred to internal teams with the expertise to assess whether there is a public interest reason to keep the content on the platform. The content in this case was one example of viral content (attracting about 21.8 million views) that should have been detected quickly, both to prevent potential harm and so that a newsworthiness assessment could be conducted.

The Board also notes that had Voice of America Urdu, the media organization posting this content, received a more detailed explanation of the policy line it had violated, it could have easily reposted the content in an edited form, e.g., with the segments containing the offending images removed or the victims’ faces blurred. In this respect, Meta should consider providing users with more specific notifications about violations, in line with the Board’s recommendation no. 1 in Armenians in Azerbaijan and recommendation no. 2 in the Breast Cancer Symptoms and Nudity case. Additionally, to ease the burden on users and reduce the risk of them endangering children, Meta could explore giving users more specific instructions or access within its products to, for instance, face-blurring tools for video, so they can more easily adhere to Meta’s policies that protect the rights of children. Meta could also consider the feasibility of suspending such content for a set period of time before it is permanently removed, if not properly edited (see recommendation no. 13 in the Sharing Private Residential Information policy advisory opinion). The author of the relevant content could be notified that, during the suspension period, they can avoid removal of their content by using such a tool to make it compliant.

9. Oversight Board Decision

The Oversight Board overturns Meta’s decision to take down the content, requiring the post to be restored.

10. Recommendations

Content Policy

1. To better inform users when policy exceptions could be granted, Meta should create a new section within each Community Standard detailing what exceptions and allowances apply. When Meta has a specific rationale for not allowing certain exceptions that apply to other policies (such as news reporting or awareness raising), Meta should include that rationale in this section of the Community Standard.

The Board will consider this implemented when each Community Standard includes the described section and rationales for exceptions that do and do not apply.

*Procedural Note:

The Oversight Board’s decisions are prepared by panels of five Members and approved by the majority of the Board. Board decisions do not necessarily represent the personal views of all Members.

For this case decision, independent research was commissioned on behalf of the Board. The Board was assisted by Duco Advisors, an advisory firm focusing on the intersection of geopolitics, trust and safety, and technology. Memetica, an organization that engages in open-source research on social media trends, also provided analysis.
