Oversight Board Overturns Meta’s Decision in Swedish Journalist Reporting Sexual Violence Against Minors Case
February 1, 2022
Please be aware before reading that the following text includes potentially sensitive material relating to sexual violence against minors.
The Oversight Board has overturned Meta’s decision to remove a post describing incidents of sexual violence against two minors. The Board found that the post did not violate the Community Standard on Child Sexual Exploitation, Abuse and Nudity. The broader context of the post makes it clear that the user was reporting on an issue of public interest and condemning the sexual exploitation of a minor.
About the Case
In August 2019, a user in Sweden posted on their Facebook page a stock photo of a young girl sitting down with her head in her hands, in a way that obscures her face, accompanied by a caption in Swedish describing incidents of sexual violence against two minors. The post contains details about the rapes of two unnamed minors, specifying their ages and the municipality in which the first crime occurred. The user also details the convictions that the two unnamed perpetrators received for their crimes.
The post argues that the Swedish criminal justice system is too lenient and incentivizes crime. The user advocates for the establishment of a sex offenders register in the country. They also provide sources in the comments section of the post, identifying the criminal cases by their court reference numbers and linking to coverage of the crimes by local media.
The post provides graphic details of the harmful impact of the crime on the first victim. It also includes quotes attributed to the perpetrator reportedly bragging to friends about the rape and referring to the minor in sexually explicit terms. While the user posted the content to Facebook in August 2019, Meta removed it two years later, in September 2021, under its rules on child sexual exploitation, abuse and nudity.
Key Findings
The Board finds that this post does not violate the Community Standard on Child Sexual Exploitation, Abuse and Nudity. The post’s precise and clinical description of the aftermath of the rape, as well as its inclusion of the perpetrator’s sexually explicit statement, did not constitute language that sexually exploited children or depicted a minor in a “sexualized context.”
The Board also concludes that the post did not show a minor in a “sexualized context,” as the broader context of the post makes clear that the user was reporting on an issue of public interest and condemning the sexual exploitation of a minor.
The Board notes that Meta does not define key terms such as “depiction” and “sexualization” in its public-facing Community Standards. In addition, while Meta told the Board that it allows “reporting” on rape and sexual exploitation, the company does not state this in its publicly available policies or define the distinction between “depiction” and “reporting.” A recommendation, below, addresses these points.
It is troubling that, after two years, Meta removed the post from the platform without adequately explaining what caused the removal. No substantive change to its policies during this period explains the removal.
The Oversight Board’s Decision
The Oversight Board overturns Meta’s decision to remove the content, and requires that the post be restored.
As a policy advisory statement, the Board recommends that Meta:
- Define graphic depiction and sexualization in the Child Sexual Exploitation, Abuse and Nudity Community Standard. Meta should make clear that not all explicit language constitutes graphic depiction or sexualization, and should explain the difference between legal, clinical or medical terms and graphic content. Meta should also clarify the distinction between depicting child sexual exploitation and reporting on it. The Board will consider this recommendation implemented when language defining the key terms and this distinction has been added to the Community Standard.
- Undertake a policy development process, including a discussion in the Policy Forum, to determine whether and how to incorporate a prohibition on the functional identification of child victims of sexual violence into its Community Standards. This process should include stakeholder and expert engagement on functional identification and the rights of the child. The Board will consider this recommendation implemented when Meta publishes the minutes of the Product Policy Forum where this is discussed.
Note: To facilitate a more comprehensive response to Board recommendations, Meta will now have 60 days instead of 30 days to respond. The Board’s Bylaws have been updated to reflect this change, and a revised version can be found here.
For Further Information:
To read the full decision, click here.
To read a synopsis of public comments for this case, please click the attachment below.