OVERTURNED
2021-016-FB-FBR

Swedish journalist reporting sexual violence against minors

The Oversight Board has overturned Meta's decision to remove a post describing incidents of sexual violence against two minors.
Policies and topics
Children / Children's rights, Safety
Adult nudity and sexual activity
Region and countries
Europe
Sweden
Platform
Facebook

Case summary

Note: Please be aware before reading that the following decision includes potentially sensitive material relating to content about sexual violence against minors.

The Oversight Board has overturned Meta’s decision to remove a post describing incidents of sexual violence against two minors. The Board found that the post did not violate the Community Standard on Child Sexual Exploitation, Abuse and Nudity. The broader context of the post makes it clear that the user was reporting on an issue of public interest and condemning the sexual exploitation of a minor.

About the case

In August 2019, a user in Sweden posted on their Facebook page a stock photo of a young girl sitting down with her head in her hands in a way that obscures her face. The photo has a caption in Swedish describing incidents of sexual violence against two minors. The post contains details about the rapes of two unnamed minors, specifying their ages and the municipality in which the first crime occurred. The user also details the convictions that the two unnamed perpetrators received for their crimes.

The post argues that the Swedish criminal justice system is too lenient and incentivizes crimes. The user advocates for the establishment of a sex offenders register in the country. They also provide sources in the comments section of the post, identifying the criminal cases by court reference numbers and linking to coverage of the crimes by local media.

The post provides graphic details of the harmful impact of the crime on the first victim. It also includes quotes attributed to the perpetrator reportedly bragging to friends about the rape and referring to the minor in sexually explicit terms. While the user posted the content to Facebook in August 2019, Meta removed it two years later, in September 2021, under its rules on child sexual exploitation, abuse and nudity.

Key findings

The Board finds that this post does not violate the Community Standard on Child Sexual Exploitation, Abuse and Nudity. The post’s precise and clinical description of the aftermath of the rape as well as inclusion of the perpetrator’s sexually explicit statement did not constitute language that sexually exploited children or depicted a minor in a “sexualized context.”

The Board also concludes that the post was not showing a minor in a “sexualized context” as the broader context of the post makes it clear that the user was reporting on an issue of public interest and condemning the sexual exploitation of a minor.

The Board notes that Meta does not define key terms such as “depiction” and “sexualization” in its public-facing Community Standards. In addition, while Meta told the Board that it allows “reporting” on rape and sexual exploitation, the company does not state this in its publicly available policies or define the distinction between “depiction” and “reporting.” A recommendation, below, addresses these points.

It is troubling that, after two years, Meta removed the post from the platform without an adequate explanation as to what caused the removal. No substantive change to the policies during this period explains the removal.

The Oversight Board’s decision

The Oversight Board overturns Meta’s decision to remove the content, and requires that the post be restored.

As a policy advisory statement, the Board recommends that Meta:

  • Define graphic depiction and sexualization in the Child Sexual Exploitation, Abuse and Nudity Community Standard. Meta should make clear that not all explicit language constitutes graphic depiction or sexualization and explain the difference between legal, clinical or medical terms and graphic content. Meta should also clarify how it distinguishes child sexual exploitation from reporting on child sexual exploitation. The Board will consider the recommendation implemented when language defining key terms and the distinction has been added to the Community Standard.
  • Undergo a policy development process, including as a discussion in the Policy Forum, to determine whether and how to incorporate a prohibition on functional identification of child victims of sexual violence in its Community Standards. This process should include stakeholder and expert engagement on functional identification and the rights of the child. The Board will consider this recommendation implemented when Meta publishes the minutes of the Product Policy Forum where this is discussed.

*Case summaries provide an overview of the case and do not have precedential value.

Full case decision

1. Decision summary

The Oversight Board overturns Meta’s decision to remove the content from Facebook. The post reports on the rape of two minors and uses explicit language to describe the assault and its impact on one of the survivors. Meta applied the Child Sexual Exploitation, Abuse and Nudity Community Standard to remove the post and referred the case to the Oversight Board. The Board finds the content does not violate the policy against depictions of child sexual exploitation and should be restored.

2. Case description

In August 2019, a user in Sweden posted on their Facebook Page a stock photo of a young girl sitting down with her head in her hands in a way that obscures her face with a caption in Swedish describing incidents of sexual violence against two minors using graphic language. The post contains details about the rapes of two unnamed minors, specifying their ages and the municipality in which the first crime had occurred. The user also details the convictions that the two unnamed perpetrators received for those crimes. One of those perpetrators reportedly received a non-custodial sentence as he was a minor when he committed the offence. The perpetrator in the other case was reported as having recently completed a custodial sentence for a violent crime against another woman. The user argues that the Swedish criminal justice system is too lenient and incentivizes crimes. The user advocates for the establishment of a sex offender register in the country. The user provides sources in the comments section of the post, identifying the criminal cases by court reference numbers and linking to coverage of the crimes by the local media. At the time this content was posted, discussions of penalties for child sexual assault were part of the broader criminal justice reform debate in Sweden. The user’s Facebook page is dedicated to posts on child sexual abusers and calls for reforming the existing penalties for sex crimes in Sweden.

The post provides extensive and graphic details of the harmful impact of the crime on the first victim, including describing her physical and mental injuries, offline and online harassment she encountered, as well as the psychological support she received. The post also includes quotes attributed to the perpetrator reportedly bragging to friends about the rape and referring to the minor in sexually explicit terms; the post describes that the perpetrator said to his friends that “the girl was ‘tight’ and proudly showed off his bloody hands.”

The post received about two million views, 2,000 comments and 20,000 reactions. According to Meta, the post was shared on a page with privacy settings set to public, which means that anyone could view the content posted. The page has about 100,000 followers, 95% of whom are located in Sweden.

From when it was posted in August 2019 until September 1, 2021, eight users submitted feedback to flag potential Hate Speech, Violence and Incitement, and Bullying and Harassment violations. The processes for users to submit feedback on a post and those for users to report an alleged violation are different; users are given both options. Feedback sends signals to Meta that are considered in the aggregate and can influence how content is prioritized on the specific user’s feed. When a user reports a post as an alleged policy violation, the post is assessed by Meta for compliance with its policies. One user reported the post on September 5, 2019, for violating the Bullying and Harassment policy, leading to an automated review that assessed the post as non-violating and left it up. In August 2021, Meta’s technology identified the post as potentially violating. Following human review, the post was determined to violate the Child Sexual Exploitation, Abuse and Nudity policy and was removed. The content creator’s account incurred a strike resulting in two separate feature limits. One feature limit prevented the user from going live on Facebook, using ad products, and creating or joining Messenger rooms. The other, a 30-day feature limit, prevented the user from creating any new content, except for private messages. After the user appealed the decision and following additional human review, the post was not restored but the strike associated with this removal was reversed. Meta reversed the strike because the company determined that the purpose of the post was to raise awareness. Meta notes in its Transparency Center that whether the platform applies a strike “depends on the severity of the content, the context in which it was shared and when it was posted,” but it does not explicitly mention that a strike can be reversed or withheld if the purpose of posting the content is to raise awareness.
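
To make concrete the distinction the decision draws between user feedback and user reports, the following minimal Python sketch models the two pathways and the strike-driven feature limits described above. All names, data structures and penalties here are illustrative assumptions, not Meta’s actual systems or API.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    feedback_signals: list = field(default_factory=list)  # considered only in the aggregate
    reports: list = field(default_factory=list)           # each report triggers a policy assessment
    strikes: int = 0
    feature_limits: list = field(default_factory=list)

def submit_feedback(post: Post, signal: str) -> None:
    # Feedback can influence how content is prioritized on a user's feed,
    # but it does not by itself send the post for a policy review.
    post.feedback_signals.append(signal)

def report_post(post: Post, alleged_policy: str, assess) -> None:
    # A report routes the post to review (automated or human). A confirmed
    # violation can add a strike, which in turn can trigger feature limits.
    post.reports.append(alleged_policy)
    if assess(post, alleged_policy):  # assess() stands in for Meta's review process
        post.strikes += 1
        post.feature_limits.extend(
            ["no_live_video", "no_ad_products", "no_messenger_rooms",
             "30_day_no_new_content"]
        )
```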

According to Meta, in 2021, it removed five pieces of content from this page, all removed for violating the Child Sexual Exploitation, Abuse and Nudity policy. Three of the removed posts were restored, following additional review which determined that the posts were removed in error. The strikes associated with these removals were reversed when the posts were restored.

When this post was removed, Meta also reduced the page’s distribution and removed it from recommendations. Meta explains, through the Transparency Center, that pages or groups that repeatedly violate its policies may be removed from recommendations and have their distribution reduced. The Transparency Center does not state how long this penalty lasts. Meta informed the Board that a page is removed from recommendations for as long as it exceeds the strike threshold. The strike threshold is three strikes for a standard violation and one strike for a severe violation (e.g., a violation involving child sexual exploitation, suicide and self-harm, or terrorism).
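
A minimal sketch of the recommendation-removal threshold described in the paragraph above, assuming strike categories are tracked per page; the function and category names are hypothetical, not Meta’s implementation.

```python
SEVERE_VIOLATIONS = {"child_sexual_exploitation", "suicide_and_self_harm", "terrorism"}

def exceeds_strike_threshold(strike_types: list) -> bool:
    """A page stays out of recommendations for as long as it exceeds the
    threshold: one strike for a severe violation, or three strikes for
    standard violations (illustrative reading of the rule described above)."""
    severe = sum(1 for s in strike_types if s in SEVERE_VIOLATIONS)
    standard = len(strike_types) - severe
    return severe >= 1 or standard >= 3
```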

3. Authority and scope

The Board has authority to review decisions that Meta submits for review (Charter Article 2, Section 1; Bylaws Article 2, Section 2.1.1). The Board may uphold or overturn Meta’s decision (Charter Article 3, Section 5), and this decision is binding on the company (Charter Article 4). Meta must also assess the feasibility of applying its decision in respect of identical content with parallel context (Charter Article 4). The Board’s decisions may include policy advisory statements with non-binding recommendations that Meta must respond to (Charter Article 3, Section 4; Article 4).

4. Relevant standards

The Oversight Board considered the following standards in its decision:

I. Facebook’s Community Standards

The policy rationale for the Child Sexual Exploitation, Abuse and Nudity policy states that Meta does not permit content that “sexually exploits or endangers children.” Under this policy, Meta removes content that “threatens, depicts, praises, supports, provides instruction for, makes statements of intent, admits participation in or shares links of the sexual exploitation of children.” Meta also prohibits content “(including photos, videos, real-world art, digital content, and verbal depictions) that shows children in a sexualized context.” This policy also prohibits content that identifies or mocks, by name or image, alleged victims of child sexual exploitation, but does not prohibit functional identification of a minor.

II. Meta’s values

Meta’s values are outlined in the introduction to Facebook’s Community Standards. The value of “Voice” is described as “paramount”:

The goal of our Community Standards has always been to create a place for expression and give people a voice. [We want] people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.

Meta limits “Voice” in service of four other values, and three are relevant here:

“Safety”: Content that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on Facebook.

“Privacy”: We’re committed to protecting personal privacy and information. Privacy gives people the freedom to be themselves, choose how and when to share on Facebook and connect more easily.

“Dignity”: We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade others.

III. Human Rights Standards

The United Nations Guiding Principles on Business and Human Rights (UNGPs) establish a voluntary framework for the human rights responsibilities of private businesses. In 2021, Meta announced its Corporate Human Rights Policy, where it re-committed to respecting human rights in accordance with the UNGPs. The Board’s analysis in this case was informed by the following human rights standards:

  • The right to freedom of opinion and expression: Article 19, International Covenant on Civil and Political Rights (ICCPR); General Comment No. 34, Human Rights Committee, 2011; UN Special Rapporteur on freedom of opinion and expression, report: A/74/486 (2019); UN Special Rapporteur on freedom of opinion and expression, report: A/HRC/17/27 (2011).
  • The best interests of the child: Article 3, Convention on the Rights of the Child (CRC); General Comment No. 25, Committee on the Rights of the Child, 2021.
  • The right to physical and mental health: Article 12, International Covenant on Economic, Social and Cultural Rights (ICESCR); Articles 17 and 19, CRC, on the rights of children to access information for the promotion of their physical and mental health, and to be protected from all forms of physical or mental violence.
  • The right to privacy: Article 17, ICCPR; Article 16, CRC; Concluding Observations, Nepal, Committee on the Rights of the Child, Sept. 21, 2005, CRC/C/15/Add.261, paras. 45-46.

5. User statement

Following Meta’s referral and the Board’s decision to accept the case, the user was sent a message notifying them of the Board’s review and providing them with an opportunity to submit a statement to the Board. The user did not submit a statement.

6. Explanation of Meta’s decision

Meta explained in its rationale that the content was removed because it violated the Community Standard on Child Sexual Exploitation, Abuse and Nudity. Meta explained that two lines made the post violative: one describing in detail the physical aftermath of the rape and the other quoting the perpetrator’s sexually explicit description of the minor as “tight.” Meta referred to expert findings from a range of sources, including the Rape, Abuse and Incest National Network (RAINN), the UK’s “2021 Tackling Child Sexual Abuse Strategy” and the EU’s “Strategy for a More Effective Fight Against Child Sexual Abuse,” as well as multiple academic articles, indicating that allowing depictions of rape can harm victims through re-traumatization, invasion of privacy and the facilitation of harassment.

Meta also explained that, while some of its policies have carve-outs to allow sharing of content that would otherwise be violating when it is posted to raise awareness or to condemn harmful actions, the challenge of “determin[ing] where the risk of [re-traumatization] begins and the benefit of raising awareness ends” led it to prohibit graphic depictions even when shared in good faith and to raise awareness. Meta states in its rationale to the Board that it does allow reporting of rape and sexual assault without graphic depiction. Meta also explained that it defines “depiction” to include showing an image or audio, describing in words, or broadcasting.

Meta explained in its rationale that it determined that the values of “Privacy,” “Safety” and “Dignity” of minors displaced the value of “Voice” because graphic content can revictimize children. Meta also stated that although the post does not name the victim, the information provided in the post could be used to identify the victim and lead to discriminatory treatment.

Meta also explained that the Convention on the Rights of the Child (CRC) served as guidance for setting its policies and values, quoting General Comment No. 25 (2021) from the UN Committee on the Rights of the Child on the need to implement policies and practices to protect children from “recognized and emerging risks of all forms of violence in the digital environment.” Meta stated to the Board that it is the risk of revictimization that led it to determine that removal was necessary. While Meta considers applying the newsworthiness exception to graphic content when the public interest in the expression is especially strong and the risk of harm is low, in this case Meta determined that the risk of harm outweighed the public interest value of the expression. According to Meta, Facebook has applied the newsworthiness allowance to violations of the Child Sexual Exploitation policy six times in the past year.

7. Third-party submissions

The Board received 10 public comments in this case from stakeholders including academia and civil society organizations focusing on the rights of sexual assault survivors, children’s rights and freedom of expression. Three were from Europe, two from Latin America and the Caribbean and five from the United States and Canada. The submissions cover themes including the importance of protecting the privacy of survivors; the danger of removing speech of survivors or organizations working on prevention of child sexual exploitation and abuse; the role of Meta’s platform design choices in promoting sensationalist posts; and the need for greater transparency and clarity around the platform’s content moderation system.

On November 30, 2021, a virtual roundtable took place with seven advocacy groups and organizations whose missions are to represent survivors of domestic and sexual violence against women and children. The discussion touched on a number of themes related to the case content, including differentiating what the general public might find to be graphic descriptions of a rape from clinical descriptions of the act and its aftermath; secondary exploitation or victimization of survivors for the purposes of soliciting or raising donations; empowering survivors by asking them what they want and obtaining informed consent when reporting on crimes committed against them; and the paramount importance of survivor agency.

To read public comments submitted for this case, please click here.

8. Oversight Board analysis

The Board looks at the question of whether content should be restored through three lenses: the Facebook Community Standards; Meta’s publicly stated values; and its human rights responsibilities. The Board concludes that the content does not violate the Facebook Community Standards and should be restored. Meta’s values and human rights responsibilities support restoring the content. The Board recommends changes in Meta’s content policies to provide a clear definition of sexualization, graphic depiction, and reporting.

8.1. Compliance with Community Standards

The Board concludes that this post does not violate the Community Standard on Child Sexual Exploitation, Abuse and Nudity, and the content should not have been removed. The Board concludes that the post’s precise and clinical description of the aftermath of the rape as well as inclusion of the perpetrator’s sexually explicit statement did not constitute language that sexually exploited children or depicted a minor in a “sexualized context.”

The Board also concludes that the post was not showing a minor in a “sexualized context” because the broader context of the post makes it clear that the user was reporting on an issue of public interest and condemning the sexual exploitation of a minor. The user replicated language used in Swedish news media outlets reporting on the testimony provided in the court cases of the rapes referred to in the post.

8.2. Compliance with Meta’s values

The Board finds that Meta’s decision to remove this post is inconsistent with its value of “Voice.” The Board agrees that the values of “Privacy,” “Safety,” and “Dignity” are of great importance when it comes to content that graphically describes the sexual exploitation of a minor. However, the Board finds the two sentences at issue did not rise to the level of content that sexually exploited children. In addition, the public interest in bringing attention to this issue and informing the public, or advocating for legal and policy reforms, are at the core of the value of “Voice.” In weighing the different values implicated in this case, the Board also notes the importance of not silencing advocates for and survivors of child sexual exploitation. The Board also recognizes that some survivors may be less likely to speak out for fear that the graphic details of the attack will go viral on the platform.

8.3. Compliance with Meta’s human rights responsibilities

The Board finds that restoring the content in this case is consistent with Meta’s human rights responsibilities.

Freedom of Expression and Article 19 of the ICCPR

Article 19 of the ICCPR provides broad protection for freedom of expression through any media and regardless of frontiers. However, the right may be restricted under certain narrow and limited conditions, known as the three-part test of legality (clarity), legitimacy, and necessity and proportionality. Although the ICCPR does not create the same obligations for Meta as it does for states, Meta has committed to respecting human rights as set out in the UNGPs. This commitment encompasses internationally recognized human rights as defined, among other instruments, by the ICCPR and the CRC. The UN Special Rapporteur on freedom of opinion and expression has suggested that Article 19, para. 3 of the ICCPR provides a useful framework to guide platforms’ content moderation practices (A/HRC/38/35, para. 6).

I. Legality (clarity and accessibility of the rules)

The requirement of legality in international human rights law provides that any restriction on freedom of expression must be: (a) sufficiently accessible, so that individuals have an adequate indication of how the law limits their rights; and (b) formulated with enough precision that individuals can regulate their conduct accordingly.

As discussed in Section 8.1 above, the Board concludes that this post did not violate Meta’s policy on child sexual exploitation; the removal was therefore not pursuant to an applicable rule. The Board also concludes that the policy could benefit from clear definitions of key terms and examples of borderline cases. The terms “depiction” and “sexualization” are not defined in the public-facing Community Standards. When Meta fails to define key terms or disclose relevant exceptions, users are unable to understand how to comply with the rules.

The Board notes that Meta’s “Known Questions” and Internal Implementation Standards (IIS), which are guidelines provided to content reviewers to help them assess content that might amount to a violation of one of Facebook’s Community Standards, provide more specific criteria when it comes to what constitutes sexualization of a minor on the platform under the Child Sexual Exploitation, Abuse and Nudity policy.

Meta informed the Board through its rationale for this case that it allows “reporting” on rape and sexual exploitation but does not state this in the publicly available policies or define the distinction between “depiction” and “reporting.” The Board notes that neither the public policies nor the Known Questions and IIS address the difference between prohibited graphic depiction or sexualization of a minor and non-violating reporting on the rape and sexual exploitation of a minor.

The Board finds it troubling that the case content remained on the platform for two years and was then removed without an adequate explanation as to what triggered the removal. No substantive change to the policies during this period explains the removal. The Board asked whether sending the content for human review was triggered by a change to the classifier. Meta indicated that a combination of machine learning/artificial intelligence classifier scores (a prediction an algorithm makes about whether a specific piece of content is likely to violate a specific policy) and the number of views the post received over a two-week period triggered sending the post for human review. In its response to the Board’s questions, Meta did not specify whether there was a change to its classifiers that would explain why its technology did not treat the content as violating in 2019 but flagged the same content as potentially violating and worth sending for human review in 2021.
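
As a rough illustration of the trigger Meta described, the sketch below combines a classifier score with a two-week view count. The thresholds and the exact combination rule were not disclosed by Meta and are invented here for illustration.

```python
def should_send_for_human_review(classifier_score: float,
                                 views_past_two_weeks: int,
                                 score_threshold: float = 0.8,
                                 view_threshold: int = 500_000) -> bool:
    # classifier_score: predicted likelihood that the content violates a
    # specific policy; both thresholds are assumptions, not Meta's values.
    return (classifier_score >= score_threshold
            and views_past_two_weeks >= view_threshold)
```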

II. Legitimate aim

Restrictions on freedom of expression should pursue a legitimate aim, which includes the protection of the rights of others. The Board agrees that the Facebook Community Standard on Child Sexual Exploitation, Abuse and Nudity aims to prevent offline harm to the rights of minors that may be related to content on Facebook. Therefore, the restrictions in this policy aim to serve the legitimate aim of protecting the rights of children to physical and mental health (Article 12 ICESCR, Article 19 CRC), consistent with the best interests of the child (Article 3 CRC).

III. Necessity and proportionality

The principle of necessity and proportionality under international human rights law requires that restrictions on expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; [and] they must be proportionate to the interest to be protected” (General Comment 34, para. 34). The principle of proportionality demands consideration for the form of expression at issue (General Comment 34, para. 34).

As the Board stated in case decision 2020-006-FB-FBR Section 8.3, Meta must show three things to demonstrate that it has selected the least intrusive instrument to address the legitimate aim:

(1) the best interests of the child could not be addressed through measures that do not infringe on speech,

(2) among the measures that infringe on speech, Meta has selected the least intrusive measure, and

(3) the selected measure actually helps achieve the goal and is not ineffective or counterproductive (A/74/486, para. 52).

Analyzing whether the aims could be achieved through measures that do not infringe on freedom of expression requires understanding the full breadth of choices Meta has made and the options available for addressing the harm. This requires transparency to the Board on amplification and on how Meta’s platform design may incentivize sensationalist content. The Board asked Meta for information or internal research on how its design choices for the Facebook platform, including its decisions or processes affecting which posts to amplify, incentivize sensationalist reporting on issues impacting children. Meta did not provide the Board with a clear answer to the question or any research on the subject. Transparency is essential to ensure public scrutiny of Meta’s actions. The lack of detail in Meta’s response to the Board’s question, and the absence of public disclosure of how the platform’s design choices on amplification affect speech, frustrate the Board’s ability to fully determine the least intrusive instrument for respecting the rights of the child in accordance with their best interests.

The Board concludes that removing this content discussing sex crimes against minors, an issue of public interest and a subject of public debate, does not constitute the least intrusive instrument of promoting the rights of the child. General Comment No. 34 highlights the importance of political expression in Article 19 of the ICCPR, including the right to freedom of expression in “political discourse,” “commentary on one’s own and on public affairs,” and “discussion of human rights,” all of which would encompass the discussion of a country’s criminal justice system and reporting on its operations in specific cases.

The Board is aware of the off-platform harm to survivors of child sexual exploitation from depictions of that exploitation being available on the platform. However, the Board draws a distinction between the perpetrator's language sexualizing the child and the user’s post quoting the perpetrator for the purpose of raising awareness on an issue of public interest. The Board agrees with the input from organizations working for and with survivors of sexual exploitation on the importance of taking into consideration the need to protect survivor testimonies or other content aimed at informing the public and engaging in advocacy for reform of legal, social and cultural barriers to preventing child sexual exploitation.

The Board considered whether the use of a warning screen may be the least intrusive measure for protecting the best interests of the child. For example, the Adult Sexual Exploitation Community Standard states that warning screens are applied to content that includes narratives or statements about adult sexual exploitation shared either by the victim or by a third party (other than the victim) 1) in support of the victim, 2) in condemnation of the act, or 3) for general awareness, to be determined by the context or caption. According to a blog post on Meta’s newsroom about tackling misinformation, the company stated that when a warning screen is applied to a piece of content, 95% of users do not click to view it. Because the Board does not have information on the baseline level of engagement, it cannot reach a conclusion about the impact of warning screens, especially as applied to content reporting on child sexual exploitation.
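
A short, hypothetical calculation illustrates why the baseline matters: the same 5% click-through rate behind a warning screen implies very different effects depending on how many users would have viewed the content without one. All numbers below are invented.

```python
def relative_reduction(baseline_view_rate: float, screened_view_rate: float) -> float:
    # Share of would-be views prevented by the warning screen, given a
    # hypothetical baseline view rate for the same content without a screen.
    return 1 - screened_view_rate / baseline_view_rate

print(relative_reduction(baseline_view_rate=0.50, screened_view_rate=0.05))  # 0.90: large effect
print(relative_reduction(baseline_view_rate=0.06, screened_view_rate=0.05))  # ~0.17: modest effect
```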

Finally, the Board also considered the potential for offline harm when reporting includes information sufficient to identify a child. Content that may lead to functional or “jigsaw” identification of a minor who has been the victim of child sexual exploitation implicates children's rights to freedom of expression (ICCPR, Art. 19), privacy (CRC, Art. 16) and safety (CRC, Art. 19). Functional identification may occur when content provides or compiles enough discrete pieces of information to identify an individual without naming them. In this case, the Board is unable to determine whether the pieces of information provided, along with links to media reports, could increase the possibility that the victims will be identified.
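
The following toy example, with entirely fictional records, shows how functional or “jigsaw” identification can work: no one is named, but each additional disclosed detail shrinks the set of people the description could match.

```python
# Fictional population; no real data.
population = [
    {"name": "A", "age": 14, "municipality": "X", "year": 2018},
    {"name": "B", "age": 14, "municipality": "Y", "year": 2018},
    {"name": "C", "age": 15, "municipality": "X", "year": 2018},
]

def candidates(records, **disclosed_details):
    # Return everyone consistent with the details disclosed in a post.
    return [r for r in records
            if all(r.get(k) == v for k, v in disclosed_details.items())]

print(len(candidates(population, age=14)))                    # 2 people match
print(len(candidates(population, age=14, municipality="X")))  # 1 person: effectively identified
```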

Some Board Members, however, emphasized that when there is doubt about whether a specific piece of content may lead to functional identification of a child victim, Meta should err on the side of protecting the privacy and physical and mental health of the child in accordance with international human rights principles. For these Board Members, the platform’s power to amplify is a key factor in assessing whether the minor can be identified and therefore the protections afforded to children who are victims of sexual abuse.

The current Child Sexual Exploitation, Abuse and Nudity Community Standard prohibits “content that identifies or mocks alleged victims of child sexual exploitation by name or image.” Other policies that deal with preventing the identification of a minor or a victim of a crime (e.g., the Additional Protection of Minors Community Standard; the Coordinating Harm and Publicizing Crime Community Standard) leave significant gaps in addressing functional identification of minors who are victims of sexual exploitation.

9. Oversight Board decision

The Oversight Board overturns Meta’s decision to remove the content and requires the post to be restored.

10. Policy advisory statement

Content Policy

  1. Meta should define graphic depiction and sexualization in the Child Sexual Exploitation, Abuse and Nudity Community Standard. Meta should make clear that not all explicit language constitutes graphic depiction or sexualization and explain the difference between legal, clinical or medical terms and graphic content. Meta should also clarify how it distinguishes child sexual exploitation from reporting on child sexual exploitation. The Board will consider the recommendation implemented when language defining key terms and the distinction has been added to the Community Standard.
  2. Meta should undergo a policy development process, including as a discussion in the Policy Forum, to determine whether and how to incorporate a prohibition on functional identification of child victims of sexual violence in its Community Standards. This process should include stakeholder and expert engagement on functional identification and the rights of the child. The Board will consider this recommendation implemented when Meta publishes the minutes of the Product Policy Forum where this is discussed.

*Procedural note:

The Oversight Board's decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.

For this case decision, independent research was commissioned on behalf of the Board. An independent research institute headquartered at the University of Gothenburg and drawing on a team of over 50 social scientists on six continents, as well as more than 3,200 country experts from around the world, provided expertise on socio-political and cultural context. Duco Advisors, an advisory firm focusing on the intersection of geopolitics, trust and safety, and technology, also provided research.

Policies and topics
Children / Children's rights, Safety
Adult nudity and sexual activity
Region and countries
Europe
Sweden
Platform
Facebook
Policies and topics
Children / Children's rights, Safety
Adult nudity and sexual activity
Region and countries
Europe
Sweden
Platform
Facebook

Case summaryCase summary

Note: Please be aware before reading that the following decision includes potentially sensitive material relating to content about sexual violence against minors.

The Oversight Board has overturned Meta’s decision to remove a post describing incidents of sexual violence against two minors. The Board found that the post did not violate the Community Standard on Child Sexual Exploitation, Abuse and Nudity. The broader context of the post makes it clear that the user was reporting on an issue of public interest and condemning the sexual exploitation of a minor.

About the case

In August 2019, a user in Sweden posted on their Facebook page a stock photo of a young girl sitting down with her head in her hands in a way that obscures her face. The photo has a caption in Swedish describing incidents of sexual violence against two minors. The post contains details about the rapes of two unnamed minors, specifying their ages and the municipality in which the first crime occurred. The user also details the convictions that the two unnamed perpetrators received for their crimes.

The post argues that the Swedish criminal justice system is too lenient and incentivizes crimes. The user advocates for the establishment of a sex offenders register in the country. They also provide sources in the comments section of the post, identifying the criminal cases by court reference numbers and linking to coverage of the crimes by local media.

The post provides graphic details of the harmful impact of the crime on the first victim. It also includes quotes attributed to the perpetrator reportedly bragging to friends about the rape and referring to the minor in sexually explicit terms. While the user posted the content to Facebook in August 2019, Meta removed it two years later, in September 2021, under its rules on child sexual exploitation, abuse and nudity.

Key findingsKey findings

The Board finds that this post does not violate the Community Standard on Child Sexual Exploitation, Abuse and Nudity. The post’s precise and clinical description of the aftermath of the rape as well as inclusion of the perpetrator’s sexually explicit statement did not constitute language that sexually exploited children or depicted a minor in a “sexualized context.”

The Board also concludes that the post was not showing a minor in a “sexualized context” as the broader context of the post makes it clear that the user was reporting on an issue of public interest and condemning the sexual exploitation of a minor.

The Board notes that Meta does not define key terms such as “depiction” and “sexualization” in its public-facing Community Standards. In addition, while Meta told the Board that it allows “reporting” on rape and sexual exploitation, the company does not state this in its publicly available policies or define the distinction between “depiction” and “reporting.” A recommendation, below, addresses these points.

It is troubling that, after two years, Meta removed the post from the platform without an adequate explanation as to what caused the removal. No substantive change to the policies during this period explains the removal.

The Oversight Board’s decision

The Oversight Board overturns Meta’s decision to remove the content, and requires that the post be restored.

As a policy advisory statement, the Board recommends that Meta:

  • Define graphic depiction and sexualization in the Child Sexual Exploitation, Nudity and Abuse Community Standard. Meta should make clear that not all explicit language constitutes graphic depiction or sexualization and explain the difference between legal, clinical or medical terms and graphic content. Meta should also provide a clarification for distinguishing child sexual exploitation and reporting on child sexual exploitation. The Board will consider the recommendation implemented when language defining key terms and the distinction has been added to the Community Standard.
  • Undergo a policy development process, including as a discussion in the Policy Forum, to determine whether and how to incorporate a prohibition on functional identification of child victims of sexual violence in its Community Standards. This process should include stakeholder and expert engagement on functional identification and the rights of the child. The Board will consider this recommendation implemented when Meta publishes the minutes of the Product Policy Forum where this is discussed.

*Case summaries provide an overview of the case and do not have precedential value.

Full case decisionFull case decision

1. Decision summary

The Oversight Board overturns Meta’s decision to remove the content from Facebook. The post reports on the rape of two minors and uses explicit language to describe the assault and its impact on one of the survivors. Meta applied the Child Sexual Exploitation, Abuse and Nudity Community Standard to remove the post and referred the case to the Oversight Board. The Board finds the content does not violate the policy against depictions of child sexual exploitation and should be restored.

2. Case description

In August 2019, a user in Sweden posted on their Facebook Page a stock photo of a young girl sitting down with her head in her hands in a way that obscures her face with a caption in Swedish describing incidents of sexual violence against two minors using graphic language. The post contains details about the rapes of two unnamed minors, specifying their ages and the municipality in which the first crime had occurred. The user also details the convictions that the two unnamed perpetrators received for those crimes. One of those perpetrators reportedly received a non-custodial sentence as he was a minor when he committed the offence. The perpetrator in the other case was reported as having recently completed a custodial sentence for a violent crime against another woman. The user argues that the Swedish criminal justice system is too lenient and incentivizes crimes. The user advocates for the establishment of a sex offender register in the country. The user provides sources in the comments section of the post, identifying the criminal cases by court reference numbers and linking to coverage of the crimes by the local media. At the time this content was posted, discussions of penalties for child sexual assault were part of the broader criminal justice reform debate in Sweden. The user’s Facebook page is dedicated to posts on child sexual abusers and calls for reforming the existing penalties for sex crimes in Sweden.

The post provides extensive and graphic details of the harmful impact of the crime on the first victim, including describing her physical and mental injuries, offline and online harassment she encountered, as well as the psychological support she received. The post also includes quotes attributed to the perpetrator reportedly bragging to friends about the rape and referring to the minor in sexually explicit terms; the post describes that the perpetrator said to his friends that “the girl was ‘tight’ and proudly showed off his bloody hands.”

The post received about two million views, 2,000 comments and 20,000 reactions. According to Meta, the post was shared on a page with privacy settings set to public, which means that anyone could view the content posted. The page has about 100,000 followers, 95% of whom are located in Sweden.

From when it was posted in August 2019 until September 1, 2021, eight users submitted feedback to flag potential Hate Speech, Violence and Incitement, and Bullying and Harassment violations. The processes for users to submit feedback on a post and those for users to report an alleged violation are different; users are given both options. Feedback sends signals to Meta that are considered in the aggregate and can influence how content is prioritized on the specific user’s feed. When a user reports a post as an alleged policy violation, the post is assessed by Meta for compliance with its policies. One user reported the post on September 5, 2019, for violating the Bullying and Harassment policy, leading to an automated review that assessed the post as non-violating and left it up. In August 2021, Meta’s technology identified the post as potentially violating. Following human review, the post was determined to violate the Child Sexual Exploitation, Abuse and Nudity policy and was removed. The content creator’s account incurred a strike resulting in two separate feature limits. One feature limit prevented the user from going live on Facebook, using ad products, and creating or joining Messenger rooms. The other, a 30-day feature limit, prevented the user from creating any new content, except for private messages. After the user appealed the decision and following additional human review, the post was not restored but the strike associated with this removal was reversed. Meta reversed the strike because the company determined that the purpose of the post was to raise awareness. Meta notes in its Transparency Center that whether the platform applies a strike “depends on the severity of the content, the context in which it was shared and when it was posted,” but it does not explicitly mention that a strike can be reversed or withheld if the purpose of posting the content is to raise awareness.

According to Meta, in 2021, it removed five pieces of content from this page, all removed for violating the Child Sexual Exploitation, Abuse and Nudity policy. Three of the removed posts were restored, following additional review which determined that the posts were removed in error. The strikes associated with these removals were reversed when the posts were restored.

When this post was removed, Meta also reduced the page’s distribution and removed it from recommendations. Meta explains, through the Transparency Center, that pages or groups that repeatedly violate their policies may be removed from recommendations and have their distribution reduced. The Transparency Center does not state how long this penalty lasts. Meta informed the Board that a page is removed from recommendations for as long as it exceeds the strike threshold. The strike threshold is three strikes for a standard violation and one strike for a severe violation (e.g., violation involving child sexual exploitation, suicide and self-harm or terrorism).

3. Authority and scope

The Board has authority to review decisions that Meta submits for review (Charter Article 2, Section 1; Bylaws Article 2, Section 2.1.1). The Board may uphold or overturn Meta’s decision (Charter Article 3, Section 5), and this decision is binding on the company (Charter Article 4). Meta must also assess the feasibility of applying its decision in respect of identical content with parallel context (Charter Article 4). The Board’s decisions may include policy advisory statements with non-binding recommendations that Meta must respond to (Charter Article 3, Section 4; Article 4).

4. Relevant standards

The Oversight Board considered the following standards in its decision:

I. Facebook’s Community Standards

The policy rationale for the Child Sexual Exploitation, Abuse and Nudity policy states that Meta does not permit content that “sexually exploits or endangers children.” Under this policy, Meta removes content that “threatens, depicts, praises, supports, provides instruction for, makes statements of intent, admits participation in or shares links of the sexual exploitation of children.” Meta also prohibits content “(including photos, videos, real-world art, digital content, and verbal depictions) that shows children in a sexualized context.” This policy also prohibits content that identifies or mocks, by name or image, alleged victims of child sexual exploitation, but does not prohibit functional identification of a minor.

II. Meta’s values

Meta’s values are outlined in the introduction to Facebook’s Community Standards. The value of “Voice” is described as “paramount”:

The goal of our Community Standards has always been to create a place for expression and give people a voice. [We want] people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.

Meta limits “Voice” in service of four other values, and three are relevant here:

“Safety”: Content that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on Facebook.

“Privacy”: We’re committed to protecting personal privacy and information. Privacy gives people the freedom to be themselves, choose how and when to share on Facebook and connect more easily.

“Dignity”: We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade others.

III. Human Rights Standards

The United Nations Guiding Principles on Business and Human Rights (UNGPs) establish a voluntary framework for the human rights responsibilities of private businesses. In 2021, Meta announced its Corporate Human Rights Policy, where it re-committed to respecting human rights in accordance with the UNGPs. The Board’s analysis in this case was informed by the following human rights standards:

  • The right to freedom of opinion and expression: Article 19, International Covenant on Civil and Political Rights ( ICCPR); General Comment No. 34, Human Rights Committee, 2011; UN Special Rapporteur on freedom of opinion and expression, report: A/74/486 (2019); UN Special Rapporteur on freedom of opinion and expression, report: A/HRC/17/27 (2011).
  • The best interest of the child: Art. 3, Convention on the Rights of the Child ( CRC); General Comment No. 25, Committee on the Rights of the Child, 2021.
  • The right to physical and mental health: Article 12, International Covenant on Economic, Social and Cultural Rights ( ICESCR); Articles 17 and 19, CRC, on the rights of children to access information for the promotion of his or her physical and mental health, and to be protected from all forms of physical or mental violence.
  • The right to privacy: Article 17, ICCPR; Article 16, CRC; Concluding Observations, Nepal, Committee on the Rights of the Child, Sept. 21, 2005, CRC/C/15/Add.261, para. 45, 46.

5. User statement

Following Meta’s referral and the Board’s decision to accept the case, the user was sent a message notifying them of the Board’s review and providing them with an opportunity to submit a statement to the Board. The user did not submit a statement.

6. Explanation of Meta’s decision

Meta explained in its rationale that the content was removed because it violated the Community Standard on Child Sexual Exploitation, Abuse and Nudity. Meta explained that two lines made the post violative, one describing in detail the physical aftermath of the rape and the second quoting the perpetrator's sexually explicit description of the minor as “tight.” Meta referred to expert findings from a breadth of sources including the Rape, Abuse and Incest National Network (RAINN), the UK’s “2021 Tackling Child Sexual Abuse Strategy” and the EU’s “Strategy for a More Effective Fight Against Child Sexual Abuse,” as well as multiple academic articles, that allowing depictions of rape can harm victims through re-traumatization, invasion of privacy and by facilitating harassment.

Meta also explained that, while some of its policies have carve-outs to allow sharing of content that would be otherwise violating when it is posted to raise awareness or to condemn harmful actions, the challenge of “determine[ing] where the risk of [re-traumatization] begins and the benefit of raising awareness ends” led it to prohibit graphic depictions even when shared in good faith and to raise awareness. Meta states in its rationale to the Board that it does allow reporting of rape and sexual assault, without graphic depiction. Meta also explained that it defines “depiction” to include showing an image, audio, describing in words, or broadcasting.

Meta explained in its rationale that it determined that the values of "Privacy," "Safety" and "Dignity" of minors displaced the value of voice because graphic content can revictimize children. Meta also stated that although the post does not name the victim, the information provided in the post could be used to identify the victim and lead to discriminatory treatment.

Meta also explained that the Convention on the Rights of the Child (CRC) served as guidance for setting its policies and values, quoting General comment No. 25 (2021) from the UN Committee on the Rights of the Child to implement policies and practices to protect children from “recognized and emerging risks of all forms of violence in the digital environment.” Meta stated to the Board that it is the risk of revictimization that led it to determine that removal was necessary. While Meta considers applying the newsworthiness exception to graphic content when the public interest in the expression is especially strong and the risk of harm is low, in this case, Meta determined that the risk of harm outweighed the public interest value of the expression. According to Meta, Facebook has applied the newsworthiness allowance to violations of the Child Sexual Exploitation policy six times in the past year.

7. Third-party submissions

The Board received 10 public comments in this case from stakeholders including academia and civil society organizations focusing on the rights of sexual assault survivors, children’s rights and freedom of expression. Three were from Europe, two from Latin America and the Caribbean and five from the United States and Canada. The submissions cover themes including the importance of protecting the privacy of survivors; the danger of removing speech of survivors or organizations working on prevention of child sexual exploitation and abuse; the role of Meta’s platform design choices in promoting sensationalist posts; and the need for greater transparency and clarity around the platform’s content moderation system.

On November 30, 2021, a virtual roundtable took place with seven advocacy groups and organizations whose missions are to represent survivors of domestic and sexual violence against women and children. The discussion touched on a number of themes related to the case content including differentiating between what the general public might find to be graphic descriptions of a rape from actual clinical descriptions of the act and its aftermath; secondary exploitation or victimization of survivors for the purposes of soliciting or raising donations; empowering survivors by asking them what they want and obtaining informed consent when reporting on crimes committed against them; and survivor agency being of paramount importance.

To read public comments submitted for this case, please click here.

8. Oversight Board analysis

The Board looks at the question of whether content should be restored through three lenses: the Facebook Community Standards; Meta’s publicly stated values; and its human rights responsibilities. The Board concludes that the content does not violate the Facebook Community Standards and should be restored. Meta’s values and human rights responsibilities support restoring the content. The Board recommends changes in Meta’s content policies to provide a clear definition of sexualization, graphic depiction, and reporting.

8.1. Compliance with Community Standards

The Board concludes that this post does not violate the Community Standard on Child Sexual Exploitation, Abuse and Nudity, and the content should not have been removed. The Board concludes that the post’s precise and clinical description of the aftermath of the rape as well as inclusion of the perpetrator’s sexually explicit statement did not constitute language that sexually exploited children or depicted a minor in a “sexualized context.”

The Board also concludes that the post was not showing a minor in a “sexualized context” because the broader context of the post makes it clear that the user was reporting on an issue of public interest and condemning the sexual exploitation of a minor. The user replicated language used in Swedish news media outlets reporting on the testimony provided in the court cases of the rapes referred to in the post.

8.2. Compliance with Meta’s values

The Board finds that Meta’s decision to remove this post is inconsistent with its value of “Voice.” The Board agrees that the values of “Privacy,” “Safety,” and “Dignity” are of great importance when it comes to content that graphically describes the sexual exploitation of a minor. However, the Board finds the two sentences at issue did not rise to the level of content that sexually exploited children. In addition, the public interest in bringing attention to this issue and informing the public, or advocating for legal and policy reforms, are at the core of the value of “Voice.” In weighing the different values implicated in this case, the Board also notes the importance of not silencing advocates for and survivors of child sexual exploitation. The Board also recognizes that some survivors may be less likely to speak out for fear that the graphic details of the attack will go viral on the platform.

8.3. Compliance with Meta’s human rights responsibilities

The Board finds that restoring the content in this case is consistent with Meta’s human rights responsibilities.

Freedom of Expression and Article 19 of the ICCPR

Article 19 of the ICCPR provides broad protection for freedom of expression through any media and regardless of frontiers. However, the right may be restricted under certain narrow and limited conditions, known as the three-part test of legality (clarity), legitimacy, and necessity and proportionality. Although the ICCPR does not create the same obligations for Meta as it does for states, Meta has committed to respecting human rights as set out in the UNGPs. This commitment encompasses internationally recognized human rights as defined, among other instruments, by the ICCPR and the CRC. The UN Special Rapporteur on freedom of opinion and expression has suggested that Article 19, para. 3 of the ICCPR provides a useful framework to guide platforms’ content moderation practices ( A/HRC/38/35, para. 6)

I. Legality (clarity and accessibility of the rules)

The requirement of legality in international human rights law provides that any restriction on freedom of expression must be: (a) sufficiently accessible, so that individuals have an adequate indication of how the law limits their rights; and (b) formulated with enough precision that individuals can regulate their conduct accordingly.

As discussed in Section 8.1 above, the Board concludes that this post did not violate Meta’s policy on child sexual exploitation; the removal was therefore not pursuant to an applicable rule. The Board also concludes that the policy could benefit from clear definitions of key terms and examples of borderline cases. The terms “depiction” and “sexualization” are not defined in the public-facing Community Standards. When Meta fails to define key terms or disclose relevant exceptions, users are unable to understand how to comply with the rules.

The Board notes that Meta’s “Known Questions” and Internal Implementation Standards (IIS), which are guidelines provided to content reviewers to help them assess content that might amount to a violation of one of Facebook’s Community Standards, provide more specific criteria when it comes to what constitutes sexualization of a minor on the platform under the Child Sexual Exploitation, Abuse and Nudity policy.

Meta informed the Board through its rationale for this case that it allows “reporting” on rape and sexual exploitation but does not state this in the publicly available policies or define the distinction between “depiction” and “reporting.” The Board notes that neither the public policies nor the Known Questions and IIS address the difference between prohibited graphic depiction or sexualization of a minor and non-violating reporting on the rape and sexual exploitation of a minor.

The Board finds it troubling that the case content remained on the platform for two years and was then removed without an adequate explanation as to what triggered the removal. No substantive change to the policies during this period explains the removal. The Board asked whether sending the content for human review was triggered by a change to the classifier. Meta indicated that it was a combination of machine learning/artificial intelligence classifier scores (a prediction an algorithm makes about whether a specific piece of content is likely to violate a specific policy) and the number of views the post received over a two-week period that triggered sending the post for human review. In its response to the Board’s questions, Meta did not specify whether its classifiers had changed such that the content would have been assessed as non-violating in 2019 but flagged as potentially violating, and worth sending for human review, in 2021.
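To illustrate the kind of mechanism Meta described, the following minimal Python sketch shows how a classifier score and a two-week view count might jointly determine whether a post is queued for human review. The thresholds, field names, and function here are hypothetical; Meta has not disclosed how these signals are actually combined or weighted.

from dataclasses import dataclass

@dataclass
class PostSignals:
    classifier_score: float    # predicted probability that the post violates a policy (0.0 to 1.0)
    views_last_14_days: int    # views accumulated over a two-week window

def should_queue_for_human_review(signals: PostSignals,
                                  score_threshold: float = 0.7,
                                  view_threshold: int = 10_000) -> bool:
    # Hypothetical rule: queue only when both the predicted violation score
    # and recent reach exceed illustrative thresholds. The actual weighting
    # Meta uses is not described in its response to the Board.
    return (signals.classifier_score >= score_threshold
            and signals.views_last_14_days >= view_threshold)

# Example: a post with a high predicted violation score and a spike in views
# would be routed to a human reviewer under this illustrative rule.
print(should_queue_for_human_review(PostSignals(classifier_score=0.82, views_last_14_days=25_000)))

Under this illustrative rule, content could sit on the platform unflagged for years and then be queued for review once its view count crossed a threshold, even with no change to the underlying classifier, which is one reason the Board sought clarity on what changed between 2019 and 2021.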

II. Legitimate aim

Restrictions on freedom of expression should pursue a legitimate aim, which includes the protection of the rights of others. The Board agrees that the Facebook Community Standard on Child Sexual Exploitation, Abuse and Nudity aims to prevent offline harm to the rights of minors that may be related to content on Facebook. Therefore, the restrictions in this policy aim to serve the legitimate aim of protecting the rights of children to physical and mental health (Article 12 ICESCR, Article 19 CRC), consistent with the best interests of the child (Article 3 CRC).

III. Necessity and proportionality

The principle of necessity and proportionality under international human rights law requires that restrictions on expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; [and] they must be proportionate to the interest to be protected” (General Comment 34, para. 34). The principle of proportionality demands consideration for the form of expression at issue (General Comment 34, para. 34).

As the Board stated in case decision 2020-006-FB-FBR Section 8.3, Meta must show three things to demonstrate that it has selected the least intrusive instrument to address the legitimate aim:

(1) the best interests of the child could not be addressed through measures that do not infringe on speech,

(2) among the measures that infringe on speech, Meta has selected the least intrusive measure, and

(3) the selected measure actually helps achieve the goal and is not ineffective or counterproductive (A/74/486, para. 52).

Analyzing whether the aims could be achieved through measures that do not infringe on freedom of expression requires understanding the full breadth of choices Meta has made and the options available for addressing the harm. This requires transparency to the Board on amplification and on how Meta’s platform design may incentivize sensationalist content. The Board asked Meta for information or internal research on how its design choices for the Facebook platform, including its decisions or processes affecting which posts to amplify, incentivize sensationalist reporting on issues impacting children. Meta did not provide the Board a clear answer to the question or any research on the subject. Transparency is essential to ensure public scrutiny of Meta’s actions. The lack of detail in Meta’s response to the Board’s question, and the absence of public disclosure of how the platform’s design choices on amplification affect speech, frustrate the Board’s ability to fully determine the least restrictive means of respecting the rights of the child in accordance with their best interests.

The Board concludes that removing this content discussing sex crimes against minors, an issue of public interest and a subject of public debate, does not constitute the least intrusive instrument of promoting the rights of the child. General Comment No. 34 highlights the importance of political expression in Article 19 of the ICCPR, including the right to freedom of expression in “political discourse,” “commentary on one’s own and on public affairs,” and “discussion of human rights,” all of which would encompass the discussion of a country’s criminal justice system and reporting on its operations in specific cases.

The Board is aware of the off-platform harm to survivors of child sexual exploitation from depictions of that exploitation being available on the platform. However, the Board draws a distinction between the perpetrator's language sexualizing the child and the user’s post quoting the perpetrator for the purpose of raising awareness of an issue of public interest. The Board agrees with the input from organizations working for and with survivors of sexual exploitation on the importance of protecting survivor testimonies and other content aimed at informing the public and advocating for reform of the legal, social, and cultural barriers to preventing child sexual exploitation.

The Board considered whether the use of a warning screen may be the least intrusive measure for protecting the best interests of the child. For example, the Adult Sexual Exploitation Community Standard states that warning screens are applied to content that includes narratives or statements about adult sexual exploitation shared either by the victim or by a third party, where the sharing is 1) in support of the victim, 2) in condemnation of the act, or 3) for general awareness, as determined by the context or caption. According to a blog post on Meta’s newsroom about tackling misinformation, when a warning screen is applied to a piece of content, 95% of users do not click to view it. Because the Board does not have information on the baseline level of engagement, it cannot reach a conclusion about the impact of warning screens, especially as applied to content reporting on child sexual exploitation.

Finally, the Board also considered the potential for offline harm when reporting includes information sufficient to identify a child. Content that may lead to functional or “jigsaw” identification of a minor who has been the victim of child sexual exploitation implicates children's rights to freedom of expression (ICCPR, Art. 19), privacy (CRC, Art. 16) and safety (CRC, Art. 19). Functional identification may occur when content provides or compiles enough discrete pieces of information to identify an individual without naming them. In this case, the Board is unable to determine whether the pieces of information provided, along with links to media reports, could increase the possibility that the victims will be identified.

Some Board Members, however, emphasized that when there is doubt about whether a specific piece of content may lead to functional identification of a child victim, Meta should err on the side of protecting the privacy and physical and mental health of the child in accordance with international human rights principles. For these Board Members, the platform’s power to amplify is a key factor in assessing whether the minor can be identified and therefore the protections afforded to children who are victims of sexual abuse.

The current Child Sexual Exploitation, Abuse and Nudity Community Standard prohibits “content that identifies or mocks alleged victims of child sexual exploitation by name or image.” Other policies that deal with preventing the identification of a minor or a victim of a crime (e.g., the Additional Protection of Minors Community Standard; the Coordinating Harm and Publicizing Crime Community Standard) leave significant gaps in addressing functional identification of minors who are victims of sexual exploitation.

9. Oversight Board decision

The Oversight Board overturns Meta’s decision to remove the content and requires the post to be restored.

10. Policy advisory statement

Content Policy

  1. Meta should define graphic depiction and sexualization in the Child Sexual Exploitation, Abuse and Nudity Community Standard. Meta should make clear that not all explicit language constitutes graphic depiction or sexualization, and explain the difference between legal, clinical or medical terms and graphic content. Meta should also clarify how it distinguishes child sexual exploitation from reporting on child sexual exploitation. The Board will consider the recommendation implemented when language defining the key terms and this distinction has been added to the Community Standard.
  2. Meta should undergo a policy development process, including a discussion in the Policy Forum, to determine whether and how to incorporate a prohibition on functional identification of child victims of sexual violence into its Community Standards. This process should include stakeholder and expert engagement on functional identification and the rights of the child. The Board will consider this recommendation implemented when Meta publishes the minutes of the Product Policy Forum where this is discussed.

*Procedural note:

The Oversight Board's decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.

For this case decision, independent research was commissioned on behalf of the Board. An independent research institute headquartered at the University of Gothenburg and drawing on a team of over 50 social scientists on six continents, as well as more than 3,200 country experts from around the world, provided expertise on socio-political and cultural context. Duco Advisors, an advisory firm focusing on the intersection of geopolitics, trust and safety, and technology, also provided research.