Multiple Case Decision
Gender identity and nudity
The Oversight Board has overturned Meta's original decisions to remove two Instagram posts depicting transgender and non-binary people with bare chests.
2 cases included in this bundle:
- IG-AZHWJWBW: Case about sexual solicitation on Instagram
- IG-PAVVDAFF: Case about sexual solicitation on Instagram
Case summary
The Oversight Board has overturned Meta’s original decisions to remove two Instagram posts depicting transgender and non-binary people with bare chests. It also recommends that Meta change its Adult Nudity and Sexual Activity Community Standard so that it is governed by clear criteria that respect international human rights standards.
About the case
In this decision, the Oversight Board considers two cases together for the first time. Two separate pieces of content were posted by the same Instagram account, one in 2021, the other in 2022. The account is maintained by a US-based couple who identify as transgender and non-binary.
Both posts feature images of the couple bare-chested with the nipples covered. The image captions discuss transgender healthcare and say that one member of the couple will soon undergo top surgery (gender-affirming surgery to create a flatter chest), which the couple are fundraising to pay for.
Following a series of alerts by Meta’s automated systems and reports from users, the posts were reviewed multiple times for potential violations of various Community Standards. Meta ultimately removed both posts for violating the Sexual Solicitation Community Standard, seemingly because they contain breasts and a link to a fundraising page.
The users appealed to Meta and then to the Board. After the Board accepted the cases, Meta found it had removed the posts in error and restored them.
Key findings
The Oversight Board finds that removing these posts is not in line with Meta’s Community Standards, values or human rights responsibilities. These cases also highlight fundamental issues with Meta’s policies.
Meta’s internal guidance to moderators on when to remove content under the Sexual Solicitation policy is far broader than the stated rationale for the policy, or the publicly available guidance. This creates confusion for users and moderators and, as Meta has recognized, leads to content being wrongly removed.
In at least one of the cases, the post was sent for human review by an automated system trained to enforce the Adult Nudity and Sexual Activity Community Standard. This Standard prohibits images containing female nipples other than in specified circumstances, such as breastfeeding and gender confirmation surgery.
This policy is based on a binary view of gender and a distinction between male and female bodies. Such an approach makes it unclear how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale.
The restrictions and exceptions to the rules on female nipples are extensive and confusing, particularly as they apply to transgender and non-binary people. Exceptions to the policy range from protests to scenes of childbirth to medical and health contexts, including top surgery and breast cancer awareness. These exceptions are often convoluted and poorly defined. In some contexts, for example, moderators must assess the extent and nature of visible scarring to determine whether certain exceptions apply. The lack of clarity inherent in this policy creates uncertainty for users and reviewers and makes it unworkable in practice.
The Board has consistently said Meta must be sensitive to how its policies impact people subject to discrimination (see, for example, the “Wampum belt” and “Reclaiming Arabic words” decisions). Here, the Board finds that Meta’s policies on adult nudity result in greater barriers to expression for women, trans, and gender non-binary people on its platforms. For example, they have a severe impact in contexts where women may traditionally go bare-chested, and people who identify as LGBTQI+ can be disproportionately affected, as these cases show. Meta’s automated systems identified the content multiple times, despite it not violating Meta’s policies.
Meta should seek to develop and implement policies that address all these concerns. It should change its approach to managing nudity on its platforms by defining clear criteria to govern the Adult Nudity and Sexual Activity policy, which ensure all users are treated in a manner consistent with human rights standards. It should also examine whether the Adult Nudity and Sexual Activity policy protects against non-consensual image sharing, and whether other policies need to be strengthened in this regard.
The Oversight Board's decision
The Oversight Board overturns Meta's original decision to remove the posts.
The Board also recommends that Meta:
- Define clear, objective, rights-respecting criteria to govern its Adult Nudity and Sexual Activity Community Standard, so that all people are treated in a manner consistent with international human rights standards, without discrimination on the basis of sex or gender. Meta should first conduct a comprehensive human rights impact assessment on such a change, engaging diverse stakeholders, and create a plan to address any harms identified.
- Provide more detail in its public-facing Sexual Solicitation Community Standard on the criteria that lead to content being removed.
- Revise its guidance for moderators on the Sexual Solicitation Community Standard so that it more accurately reflects the public rules on the policy. This would help to reduce enforcement errors on Meta’s part.
*Case summaries provide an overview of the case and do not have precedential value.
Full case decision
1. Decision summary
The Oversight Board overturns Meta’s original decisions in two cases of Instagram posts removed by Meta. Meta has acknowledged that its original decisions in both cases were wrong. These cases raise important concerns about how Meta’s policies disproportionately impact the expressive rights of both women and LGBTQI+ users of its platforms. The Board recommends that Meta define clear, objective, rights-respecting criteria to govern the entirety of its Adult Nudity and Sexual Activity policy, ensuring equal treatment of all people that is consistent with international human rights standards and avoids discrimination on the basis of sex or gender identity. Meta should first conduct a comprehensive human rights impact assessment to review the implications of adopting such criteria, including broadly inclusive stakeholder engagement across diverse ideological, geographic and cultural contexts. To the degree that this assessment identifies any potential harms, implementation of the new policy should include a mitigation plan for addressing them.
The Board further recommends that Meta clarify its public-facing Sexual Solicitation policy and narrow its internal enforcement guidance to better target such violations.
2. Case description and background
These cases concern two content decisions made by Meta, which the Oversight Board is addressing together in this decision. Two separate images with captions were posted on Instagram by the same account which is jointly maintained by a US-based couple. Both images feature the couple, who stated in the posts, and in their submissions to the Board, that they identify as transgender and non-binary.
Meta removed both posts under the Sexual Solicitation Community Standard. In both cases, Meta’s automated systems identified the content as potentially violating.
In the first image, posted in 2021, both people are bare-chested and have flesh-colored tape covering their nipples. In the second image, posted in 2022, one person is clothed while the other person is bare-chested and covering their nipples with their hands. The captions accompanying these images discuss how the person who is bare-chested in both pictures will soon undergo top surgery (gender-affirming surgery that creates a flatter chest). They describe their plans to document the surgery process and discuss transgender healthcare issues. They announce that they are holding a fundraiser to pay for the surgery because they have had difficulty securing insurance coverage for the procedure.
In the first case, Meta’s automated systems first classified the image as unlikely to be violating, so the report was closed without human review and the content initially remained on the platform. Three users then reported the content for pornography and self-harm. These reports were reviewed by human moderators, who found the post to be non-violating. When a user reported the content a fourth time, another human reviewer found that the post violated the Sexual Solicitation Community Standard and removed it.
In the second case, the post was identified twice by Meta’s automated systems and sent for human review, where it was found to be non-violating both times. Two users then reported the content, but each report was closed automatically without human review and the content remained on Instagram. Finally, Meta’s automated systems identified the content a third time and sent it for human review. On the last two of these occasions, it was Meta’s automated Adult Nudity and Sexual Activity classifier that flagged the content, though the reason for these repeated reviews is unclear. This final human reviewer found the post violated the Sexual Solicitation Community Standard and removed it.
The account owners appealed both removal decisions to Meta, and the content was reviewed by human reviewers in both cases. However, these reviews did not lead to Meta restoring the posts. The account owners then appealed both removal decisions to the Board. The Board is considering these two cases together, a first for the Board; doing so allows it to identify similar issues in Meta’s content policies and processes and to offer solutions that address them.
After the Board selected these posts and Meta was asked to provide a justification for its decision to remove the content, Meta identified the removals as "enforcement errors" and restored the posts.
When considering why these cases raise important issues, the Board notes as relevant context the high volume of public comments received in these cases. Many came from people who identify as trans, non-binary or cisgender women and who explained that they were personally affected by enforcement errors and issues similar to those present in these cases.
The Board has also noted as relevant context the academic research, also cited in public comments, by Haimson et al., by Witt, Suzor and Huggins, and two reports by Salty on algorithmic bias and censorship of marginalized communities. These studies found that enforcement errors under the two Community Standards discussed in these cases disproportionately affect women and the LGBTQI+ community. A co-author of one of these studies is a member of the Oversight Board.
3. Oversight Board authority and scope
The Board has authority to review Meta’s decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
The Board may uphold or overturn Meta’s decision (Charter Article 3, Section 5), and this decision is binding on the company (Charter Article 4). Meta must also assess the feasibility of applying its decision in respect of identical content with parallel context (Charter Article 4). The Board’s decisions may include policy advisory statements with recommendations to which Meta must respond (Charter Article 3, Section 4; Article 4).
When the Board selects cases like these, where Meta acknowledges that it made an error after the Board identifies the case, the Board reviews the original decision. This is to increase understanding of the policy parameters and content moderation processes that contributed to the error and to address issues the Board identifies with the underlying policies. The Board also aims to make recommendations to lessen the likelihood of future errors and treat users more fairly moving forward.
When the Board identifies cases that raise similar issues, they may be assigned to a panel simultaneously to deliberate together. A binding decision will be made in respect of each piece of content.
4. Sources of authority
The Oversight Board considered the following authorities and standards:
I. Oversight Board decisions:
- “Reclaiming Arabic words” decision (2022-003-IG-UA). The Board analyzed the challenges of applying policy exceptions and the disproportionate impacts of some policy choices.
- “Wampum belt” decision (2021-012-FB-UA). The Board analyzed the challenges of applying policy exceptions and the disproportionate impacts of some policy choices.
- “Breast cancer symptoms and nudity” decision (2020-004-IG-UA). The Board analyzed the Instagram Community Guidelines and recommended that Meta clarify that there is a breast cancer awareness exception.
II. Meta’s content policies:
These cases involve Instagram’s Community Guidelines and Facebook’s Community Standards. Meta’s Transparency Center states that “Facebook and Instagram share Content Policies. This means that if content is considered violating on Facebook, it is also considered violating on Instagram.”
Sexual Solicitation
Instagram’s Community Guidelines state that “offering sexual services” is not allowed. This provision then links to Facebook’s Community Standard on Sexual Solicitation.
In the policy rationale on Sexual Solicitation, Meta states: “We draw the line, however, when content facilitates, encourages or coordinates sexual encounters or commercial sexual services between adults. We do this to avoid facilitating transactions that may involve trafficking, coercion and non-consensual sexual acts. We also restrict sexually explicit language that may lead to sexual solicitation because some audiences within our global community may be sensitive to this type of content, and it may impede the ability for people to connect with their friends and the broader community.”
Facebook’s Community Standard on Sexual Solicitation states that Meta prohibits both explicit and implicit solicitation. Implicit solicitation has two criteria, both of which must be met for content to violate the policy. The first is “offer or ask” which is “Content that implicitly or indirectly (typically through providing a method of contact) offers or asks for sexual solicitation.” The second criterion is “suggestive elements” which is “Content that makes the aforementioned offer or ask using one of the following sexually suggestive elements.” The elements listed include “regional sexualized slang” and “poses.”
Adult Nudity and Sexual Activity
Instagram's Community Guidelines state that users should: “Post photos and videos that are appropriate for a diverse audience. We know that there are times when people might want to share nude images that are artistic or creative in nature, but for a variety of reasons, we don't allow nudity on Instagram. This includes photos, videos and some digitally-created content that show sexual intercourse, genitals and close-ups of fully-nude buttocks. It also includes some photos of female nipples, but photos in the context of breastfeeding, birth giving and after-birth moments, health-related situations (for example, post-mastectomy, breast cancer awareness or gender confirmation surgery) or an act of protest are allowed.” This section links to Facebook’s Adult Nudity and Sexual Activity policy, which provides more detail on these rules.
As part of the policy rationale of the Adult Nudity and Sexual Activity Community Standard, Meta explains that: “We restrict the display of nudity or sexual activity because some people in our community may be sensitive to this type of content. Additionally, we default to removing sexual imagery to prevent the sharing of non-consensual or underage content.”
Facebook’s Adult Nudity and Sexual Activity policy also states: “Do not post: Uncovered female nipples except in the context of breastfeeding, birth giving and after-birth moments, medical or health context (for example, post-mastectomy, breast cancer awareness or gender confirmation surgery) or an act of protest.” Users can also post imagery of genitalia when shared in a “medical or health context” (which includes gender confirmation surgery) but a label will be applied warning people that the content is sensitive. There are also at least 18 additional internal guidance factors about nipples and these exceptions.
III. Meta’s values:
Meta's values are outlined in the introduction to the Facebook Community Standards where the value of "Voice" is described as "paramount":
The goal of our Community Standards has always been to create a place for expression and give people a voice. […] We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.
Meta limits "Voice" in service of four values, two of which are relevant here:
"Safety": We are committed to making Facebook a safe place. Expression that threatens people has the potential to intimidate, exclude or silence others and isn't allowed on Facebook.
"Dignity": We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade them.
IV. International human rights standards:
The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. In 2021, Meta announced its Corporate Human Rights Policy, in which it reaffirmed its commitment to respecting human rights in accordance with the UNGPs. The Board’s analysis of Meta’s human rights responsibilities in these cases was informed by the following human rights standards:
- The rights to freedom of opinion and expression: Article 19, International Covenant on Civil and Political Rights (ICCPR), General Comment No. 34, Human Rights Committee, 2011; Communication 488/1992; Resolution 32/2 Human Rights Council, 2016; UN Special Rapporteur on freedom of opinion and expression, reports: A/HRC/38/35 (2018) and A/74/486 (2019).
- The rights of women: Article 2 and Article 5, Convention on the Elimination of All Forms of Discrimination against Women (CEDAW).
- The right to non-discrimination: Article 2, para. 1 and Article 26, ICCPR; Nepomnyashchiy v Russia, Human Rights Committee, 2018 (CCPR/C/123/D/2318/2013).
5. User submissions
In their submissions for these cases, the users state that they believe this content was removed because of transphobia. They write that if the Board were to affirm that this content should remain on the platform, the decision would contribute to making Instagram a more hospitable space for LGBTQI+ expression.
6. Meta’s submissions
Meta explained in its decision rationale that both content removals were enforcement errors and that neither post violated its Sexual Solicitation policy. Meta states: “the only offer or ask is for donations to a fundraiser or to visit a website to buy t-shirts, neither of which relates to sexual solicitation.”
Meta also states that neither post violates its Adult Nudity and Sexual Activity Standard. The rationale states that the internal “reviewer guidance specifically addresses how to action on non-binary, gender neutral, or transgender nudity.” The content in these cases was shared in an “explicitly non-binary or transgender context as evidenced by the overall topic of the content (undergoing top surgery) and the hashtags used.” Meta concluded that "even if the nipples in these cases were visible and uncovered, they would not violate our Adult Nudity and Sexual Activity policy." Meta also acknowledged that in both images, the nipples are “fully obscured.”
Given the time elapsed since the content was removed, Meta could not tell the Board which policy or policies the various automated systems that identified the content as potentially violating were programmed to enforce. In one case, Meta was able to explain that the content was enqueued for review twice by Adult Nudity and Sexual Activity classifiers. Meta also could not provide any explanation as to why the reviewers thought the content violated the Sexual Solicitation policy. The rationale acknowledges that Meta is “aware that some content reviewers may incorrectly remove content as implicit sexual solicitation (even though it is not) based on an overly-technical application of our internal reviewer guidance.”
The Board asked Meta 18 questions, and Meta answered all of them.
7. Public comments
The Oversight Board considered 130 public comments related to these cases. Ninety-seven of the comments were submitted from the United States and Canada, 19 from Europe, 10 from Asia Pacific and Oceania, one from Latin America and the Caribbean, one from the Middle East and North Africa, one from Sub-Saharan Africa and one from Central and South Asia.
The submissions covered the following themes: erroneous removals of content from trans, non-binary, and female users; the unfairness and inequality of gender-based distinctions to determine what forms of nudity are permitted on the platform; confusion over what content is permissible under the Adult Nudity and Sexual Activity, and Sexual Solicitation Community Standards; and the importance of social media for expression in societies where LGBTQI+ rights are being threatened.
To read public comments submitted for these cases, please click here. Several comments submitted have not been included as they contained personally identifying information regarding individuals other than the commenter.
8. Oversight Board analysis
The Board looked at the question of whether these posts should be restored through three lenses: Meta's content policies, the company's values and its human rights responsibilities.
The Board selected these cases because the removal of non-violating content posted by people who identify with marginalized groups affects their freedom of expression. This is particularly significant as Instagram can be an important forum for these groups to build community. These cases demonstrate how enforcement errors may have a disproportionate impact on certain groups and may signify wider issues in policy and enforcement that should be fixed.
8.1 Compliance with Meta’s content policies
The Board finds these posts do not violate any Meta content policy. While the Community Guidelines apply to Instagram, Meta also states that “Facebook and Instagram share content policies. Content that is considered violating on Facebook is also considered violating on Instagram.” The Facebook Community Standards provide more detail and are linked in the Guidelines.
a. Sexual Solicitation
The Sexual Solicitation Community Standard states that implicit sexual solicitation requires two elements:
- Content that contains an implicit offer or ask AND
- Sexually suggestive elements.
Implicit offer or ask.
An implicit offer or ask is defined in the Sexual Solicitation Community Standard as “Content that implicitly or indirectly (typically through providing a method of contact) offers or asks for sexual solicitation.” In Meta’s “Known Questions,” which provide additional internal guidance to reviewers, the list of contact information that triggers removal as an implicit offer includes social media profile links and “links to subscription-based websites (for example, OnlyFans.com or Patreon.com).” In these cases, the content provided a link to a platform where the users were hosting a fundraiser to pay for surgery. Because Meta’s internal criteria defining an “implicit offer or ask” are very broad, this link would technically qualify as an “offer or ask” under Meta’s reviewer guidance despite not violating the public-facing standard, which indicates the offer or ask must be for something sexual.
Sexually suggestive element.
The Community Standard provides a list of sexually suggestive elements, which includes poses. The Known Questions provide a list, described by Meta as exhaustive, of what are characterized as sexually suggestive poses, including nude “female breasts covered either digitally or by human body parts or objects.” In both images, the Board notes, there are breasts covered by human body parts (hands) or objects (tape). In these cases, the content of the posts makes clear that the subjects of the photos identify as trans and non-binary, meaning that the breasts depicted belong to individuals who do not identify as women. The Board also finds the content is not sexually suggestive. On that basis, the second element required to violate the Sexual Solicitation policy, a sexually suggestive element such as a sexual pose (which includes a covered female breast), was not met.
Because the second element was not satisfied, the posts did not violate this standard. Applying the public version of the first element (which indicates the offer/ask must be for something sexual) also indicates that these images would not constitute sexual solicitation.
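The gap between the public-facing rule and the internal guidance can be made concrete with a minimal decision-logic sketch. The following Python fragment is purely illustrative and is not Meta’s actual system or guidance; every function and field name is a hypothetical simplification of the criteria described above.

```python
# Illustrative sketch only: models the Board's reading of the two-element
# implicit solicitation test. Not Meta's code; all names are hypothetical.
from dataclasses import dataclass

@dataclass
class Post:
    has_contact_or_link: bool       # e.g., a fundraiser or subscription link
    offer_is_sexual: bool           # does the offer or ask relate to sex?
    covered_breasts_in_image: bool  # listed internally as a "suggestive pose"
    sexually_suggestive: bool       # a contextual judgment about the content

def violates_public_rule(post: Post) -> bool:
    # Public-facing standard: the offer or ask must itself be sexual
    # and must be made using a sexually suggestive element.
    return post.offer_is_sexual and post.sexually_suggestive

def violates_internal_guidance(post: Post) -> bool:
    # Internal guidance as the Board describes it: any contact method or
    # third-party link counts as an "offer or ask," and covered breasts
    # count as a "sexually suggestive pose," regardless of sexual intent.
    return post.has_contact_or_link and post.covered_breasts_in_image

post = Post(has_contact_or_link=True,    # fundraiser link for top surgery
            offer_is_sexual=False,
            covered_breasts_in_image=True,
            sexually_suggestive=False)

print(violates_public_rule(post))        # False: nothing sexual is offered
print(violates_internal_guidance(post))  # True: the over-removal at issue
```

On the sketch’s assumptions, the same post is non-violating under the public-facing rule but violating under the internal guidance, which is precisely the mismatch described above.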
b. Adult Nudity and Sexual Activity
The Adult Nudity and Sexual Activity Community Standard states that users should not post images of “uncovered female nipples except in the context of breastfeeding, birth giving and after-birth moments, medical or health context (for example, post-mastectomy, breast cancer awareness or gender confirmation surgery) or an act of protest.” Meta’s Known Questions further state that reviewers should allow “imagery of nipples when shared in an explicitly female-to-male transgender, non-binary, or gender-neutral context (e.g., a user indicates such gender identity), regardless of size or shape of breast.” Neither image in these cases violates this Community Standard.
First, neither of the images features uncovered nipples. In both images, the individuals have covered their nipples with either their hands or tape. Second, even had the nipples been uncovered, the images were shared with accompanying text that made clear the individuals identify as non-binary. This policy is therefore not violated.
8.2 Compliance with Meta’s values
The Board finds that the original decisions to remove these posts were inconsistent with Meta's values of "Voice" and "Dignity" and did not serve the value of "Safety."
Enforcement errors that disproportionately affect groups facing discrimination pose a serious threat to “Voice” and “Dignity.” While Meta’s human rights arguments discussed “Safety,” particularly related to non-consensual image sharing, sex trafficking, and child abuse, the Board finds these removals did not advance “Safety.”
8.3 Compliance with Meta’s human rights responsibilities
Freedom of expression (Article 19 ICCPR)
Article 19 of the ICCPR provides for broad protection of expression, including discussion of human rights and expression which people may find offensive (General Comment 34, para. 11). The right to freedom of expression is guaranteed to all people without discrimination as to “sex” or “other status” (ICCPR, Article 2, para. 1). The Human Rights Committee has confirmed in cases such as Nepomnyashchiy v Russia (CCPR/C/123/D/2318/2013) that the prohibition on discrimination includes discrimination on the grounds of gender identity.
The content relates to important social issues. For these users, Instagram provides a forum to discuss and represent their gender expression, offering a forum to make connections and derive support. The content may also directly affect the users’ ability to pursue gender confirmation surgery, as both posts explain that one person will undergo top surgery and share a fundraiser to offset the surgery costs.
Article 19 requires that where restrictions on expression are imposed by a state, they must meet the requirements of legality, legitimate aim, and necessity and proportionality (ICCPR, Article 19, para. 3). Relying on the UNGPs framework, the UN Special Rapporteur on freedom of opinion and expression has called on social media companies to ensure that their content rules are guided by the requirements of Article 19, para. 3, ICCPR (A/HRC/38/35, paras. 45 and 70). The Board has adopted this framework to analyze Meta’s policies and enforcement.
In these cases, the Board finds that Meta has not met its responsibilities to create and enforce policies that align with these standards. The internal criteria applied to remove content under the Sexual Solicitation policy are more expansive than the stated rationale for the policy, with overenforcement consequences that Meta itself has recognized. The Adult Nudity and Sexual Activity Community Standard disproportionately impacts women and LGBTQI+ users and relies on subjective and speculative perceptions of sex and gender that are not practicable when engaging in content moderation at scale. The Board analyzes these shortcomings below and recommends that Meta begin a comprehensive process to address these problems.
I. Legality (clarity and accessibility of the rules)
Rules restricting expression must be clear and accessible so that both those responsible for enforcing them, and users, know what is allowed. Both Community Standards considered in these cases fall short of that standard.
a. Sexual Solicitation
The Board finds that the Sexual Solicitation Community Standard contains overbroad criteria in the internal guidelines provided to reviewers. This poorly tailored guidance contributes to over-enforcement by reviewers and confusion for users. Meta acknowledged this, explaining to the Board that applying its internal guidance could “lead to over-enforcement” in cases where the criteria for implicit sexual solicitation are met but it is clear that there was “no intention to solicit sex.”
The confusion is reflected in both elements of this policy. In relation to the “offer or ask” component of the Sexual Solicitation Community Standard, the public-facing rules refer to a “method of contact” for the soliciting party. However, the guidance for moderators, the Known Questions, states that a “method of contact” for an implicit “offer or ask” includes social media profile links or links to third-party subscription-based websites such as Patreon. It is not made clear to users that any link to another social media profile, third-party payment platform or fundraising page (such as Patreon or GoFundMe) could mean that their post is treated as a solicitation. This confusion is reflected in the many public comments the Board received from people who did not understand why content including such third-party links was removed or led to their accounts being banned.
The second criterion, requiring a sexually suggestive element, is broad and vague, as well as inconsistent with Meta’s Adult Nudity and Sexual Activity policy. The public-facing Community Standard includes “sexually suggestive poses” as a sexually suggestive element. The Known Questions then provide a detailed list of “sexually suggestive poses” which includes being topless and covering breasts with hands or objects. Users will likely not be able to predict that any image with covered breasts is considered a sexually suggestive pose. This confusion is compounded by the fact that the Adult Nudity policy permits topless photos where the nipples are covered. In this respect, content that is considered sexual under one policy is not considered sexual under another policy.
In addition to user uncertainty, the fact that reviewers repeatedly reached different outcomes about this content suggests a lack of clarity for moderators on what content should be considered sexual solicitation.
As Meta acknowledges, the application of its internal guidance on the two elements of implicit solicitation is removing content that does not seek sexual acts. In the longer term, erroneous removals will likely be best addressed by modifying the scope of this policy. In the short term, however, the Board recommends that Meta revise its internal guidelines to ensure that the criteria reflect the public-facing rules and require a clearer connection between the "offer or ask" and the "sexually suggestive element." Meta should also provide users with more explanation of what constitutes an "offer or ask" for sex and what constitute sexually suggestive poses in the public Community Standards.
b. Adult Nudity and Sexual Activity
The Adult Nudity and Sexual Activity Standard is premised on sex and gender distinctions that are difficult to implement, and it contains exceptions that are poorly defined. Certain rules in the policy will be confusing to users, who do not know what is allowed. This also causes confusion for moderators, who must make subjective assessments based on unavoidably incomplete information and rapidly apply a rule with numerous factors, exclusions and presumptions.
Although the rules use language focused on specific body parts rather than gender as such (and Meta allows users to choose from a wide range of gender identities on their profiles), most of them do not explain how the company handles content depicting intersex, trans or non-binary people. For example, the policy refers to “male and female genitalia,” “female breasts” and “female nipples,” but it is unclear how these descriptions are applied to people whose bodies and identities may not align with these definitions. Many trans and non-binary people submitted public comments to the Board stating that users do not know if their content is assessed and categorized according to their gender identity, the sex they were assigned at birth, or aspects of their physical appearance.
The current rules require human reviewers to quickly assess both a user’s sex, as this policy applies to “female nipples,” and their gender identity, as there are exceptions based on whether the depicted person is non-binary, gender neutral, transgender, or posting in a gender confirmation surgery context. Perceptions of sex and gender require the interpretation of contextual clues and appearance, both of which are subjective determinations conducive to errors.
This approach is further complicated by Meta’s “default to female principle” whereby more restrictive policies applicable to female (as opposed to male) nudity are applied in situations of doubt. The Known Questions state that where there is no clear context and the person in the image “presents as female OR male-to-female transgender context exists, then default to female nudity and apply the relevant policy.”
The restrictions and exceptions to the rules on nipples perceived as female are extensive and confusing. Exceptions range from acts of protest, to scenes of childbirth and breastfeeding, to medical and health contexts, including post-mastectomy images and breast cancer awareness. The exceptions are often undefined or poorly defined. The list of exceptions has also grown substantially over time and can be expected to continue to grow as expression evolves. When it comes to women’s breasts, Meta’s Adult Nudity and Sexual Activity policy makes the default assumption that such depictions constitute sexual imagery. Yet the expanding list of exceptions reflects that, under many circumstances recognized in the policy, images of women’s breasts are not sexually suggestive.
Even within each exception, numerous questions arise. For example, the gender confirmation surgery exception is of particular importance to trans and non-binary users, but Meta does not explain the scope of this exception in its public-facing rules. As a result, many public comments expressed confusion over whether content permitted under the exception could include pre-surgery photos (to create a before-and-after image) and images of trans women who have received breast augmentations. The internal guidelines and Known Questions make clear that this exception is narrower than the public guidance might be construed to imply.
Meta’s policies are premised on binary distinctions between male and female, creating challenges when Meta tries to articulate its gender confirmation surgery exception. In Meta’s responses to the Board, Meta explained that the gender confirmation surgery exception means that it allows “uncovered female nipples before the individual has top surgery to remove their breasts when the content is shared in an explicitly female-to-male transgender, non-binary, or gender-neutral context.” The rules further state “Nipples of male-to-female transgender women having undergone a breast augmentation (top surgery) are prohibited, unless scarring over nipple is present.”
The internal guidelines on surgical scarring and nipples are even more convoluted. The rules for mastectomies, for example, permit “Instances where the nipple is reconstructed from other tissue or stencilled or tattooed” and “instances where at least one surgically removed breast is present, even if the other bare female nipple is visible.” Even more confusingly, the rules state that “For mastectomies, scarring includes depiction of the area where the removed breast tissue used to be. The actual surgical scar does not need to be visible.”
Reviewers will likely struggle to apply rules that require them to rapidly assess the sex-specific characteristics of the depicted person to decide whether to apply the female nipple rules, then the gender of the person to determine whether some exceptions apply, and then whether the content depicts the precursor or aftermath of a surgical procedure, which surgical procedure, and the extent and nature of the visible scarring, to determine whether other exceptions may apply. The same image of female-presenting nipples would be prohibited if posted by a cisgender woman but permitted if posted by an individual self-identifying as non-binary. The Board also notes additional nipple-related exceptions based on contexts of protest, birth giving, after-birth moments and breastfeeding, which it did not examine here but which must also be assessed and presumably involve additional internal criteria.
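This chain of judgments can likewise be approximated as a minimal sketch. Again, this is illustrative only and is not Meta’s actual guidance; every category, context label and function name is a hypothetical simplification of the rules the Board quotes.

```python
# Illustrative sketch only: approximates the chain of subjective judgments
# described above. Not Meta's guidance; all names and categories are
# hypothetical simplifications of the rules the Board quotes.
from typing import Optional

FEMALE_CONTEXT_EXCEPTIONS = {"female-to-male transgender", "non-binary",
                             "gender-neutral"}
CONTEXT_EXCEPTIONS = {"breastfeeding", "birth giving", "after-birth",
                      "protest", "breast cancer awareness",
                      "post-mastectomy"}

def female_nipple_rule_applies(perceived_presentation: str,
                               gender_context: Optional[str]) -> bool:
    # Exception: an explicitly trans, non-binary or gender-neutral context
    # allows the imagery "regardless of size or shape of breast."
    if gender_context in FEMALE_CONTEXT_EXCEPTIONS:
        return False
    # "Default to female": with no clear context, a female-presenting body
    # is assessed under the more restrictive rules.
    return perceived_presentation == "female"

def exception_applies(context: str, scarring_present: bool) -> bool:
    if context in CONTEXT_EXCEPTIONS:
        return True  # each branch hides further internal criteria
    if context == "post-augmentation (male-to-female)":
        return scarring_present  # prohibited unless scarring over the nipple
    return False

def remove(presentation: str, gender_context: Optional[str],
           context: str, scarring: bool) -> bool:
    if not female_nipple_rule_applies(presentation, gender_context):
        return False
    return not exception_applies(context, scarring)

# The same image is removed or allowed depending entirely on how a reviewer
# perceives sex, gender identity and surgical context:
print(remove("female", None, "everyday photo", False))          # True
print(remove("female", "non-binary", "everyday photo", False))  # False
```

Even in this drastically reduced form, each branch turns on a perception of sex, gender identity or surgical history that a reviewer must form from an image and limited context in a matter of seconds.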
Given the importance of expressive rights about matters of gender, physical health, childbirth and parenting, the current complex patchwork of exceptions creates undue uncertainty for users and holds the potential for misapplied rules, as evidenced by this case. The lack of clarity for users and moderators inherent in this policy makes the standard unworkable. As further discussed below, the Board believes that Meta should adopt an approach to adult nudity that ensures that all people are treated without discrimination on the basis of sex or gender identity.
II. Legitimate aim
ICCPR Article 19 provides that when states restrict expression, they may only do so in furtherance of legitimate aims, which are set forth as: “respect for the rights or reputations of others . . . [and] the protection of national security or of public order (ordre public), or of public health and morals.” This decision examines Meta’s rationales for limiting speech in its policies in light of these standards.
a. Sexual Solicitation
Meta explains that its Sexual Solicitation policy is intended to prevent users from using Facebook or Instagram to facilitate “transactions that may involve trafficking, coercion and non-consensual sexual acts” which could occur off-platform. This is an example of protecting the rights of others, which is a legitimate aim.
b. Adult Nudity and Sexual Activity
Meta provided several rationales for particular aspects of its Adult Nudity and Sexual Activity policy, including preventing the spread of non-consensual content, protecting minors where the age of the person is unclear, and the fact that “some people in our community may be sensitive to this type of content.” Meta also provided an explanation of its general principles on nudity to the Board. It states that “In drafting our policy, Meta considered (1) the private or sensitive nature of the imagery; (2) whether consent was given in the taking and sharing of nude images; (3) the risk of sexual exploitation; and (4) whether the disclosure of such images could lead to harassment off-platform, particularly in countries where such images may be culturally offensive.”
Most of these objectives align with protecting the rights of others. However, Meta’s rationale of protecting “community sensitivity” merits further examination. This rationale has the potential to align with the legitimate aim of “public morals.” That said, the Board notes that the aim of protecting “public morals” has sometimes been improperly invoked by governmental speech regulators to violate human rights, particularly those of members of minority and vulnerable groups. The Human Rights Committee has cautioned that “the concept of morals derives from many social, philosophical and religious traditions; consequently, limitations... for the purpose of protecting morals must be based on principles not deriving exclusively from a single tradition” (Human Rights Committee, General Comment 34).
While human rights law does recognize that public morals can constitute a legitimate aim for states limiting free expression, and public nudity restrictions exist around the world, Meta emphasizes aims other than “community sensitivities” in the specific context of these cases. Meta stated that “[a]lthough [its] nudity policy is consistent with the protection of public morals [… it] is not ultimately based on this aim because moral standards around nudity differ so widely across cultures and would not be implementable at scale.” For example, in many communities and parts of the world, depictions of uncovered transgender and non-binary breasts might well be considered to traverse community sensitivities, yet Meta does not restrict such expression. Moreover, the Board is concerned about the known and recurring disproportionate burdens on expression that have been experienced by women, transgender and non-binary people due to Meta’s policies (see below). For these reasons, in examining Meta’s human rights responsibilities the Board focuses on the other aims, beyond “community sensitivities,” that Meta has advanced.
It should be noted that some of the reasons Meta provides for its nudity policy reflect a default assumption that women’s breasts are sexually suggestive. The Board received public comments from many users expressing concern about the presumptive sexualization of women’s, trans and non-binary bodies, when no comparable assumption is applied to images of cisgender men (see, e.g., Public Comment 10624 submitted by InternetLab).
The Board received many public comments in these cases through its normal case outreach processes. As a body committed to offering a measure of accountability to Meta’s user base and key stakeholders, the Board considers comments seriously as a part of its deliberations. As with all cases, we understand that these comments may not be representative of global opinion. The Board appreciates the experiences and expertise shared through comments and continues to take steps to increase the breadth of its outreach to communities that may not currently be participating in this process.
Finally, the Board recognizes that Meta may legitimately factor in the importance of preventing certain harms that can have gendered impacts. As noted by the United Nations Special Rapporteur on violence against women, it is "important to acknowledge that the Internet is being used in a broader environment of widespread and systemic structural discrimination and gender-based violence against women and girls" (A/HRC/38/47). Further, surveys indicate that "90 per cent of those victimized by non-consensual digital distribution of intimate images are women." (A/HRC/38/47). Meta should seek to limit gendered harms, both in the over-enforcement and under-enforcement of nudity prohibitions.
III. Necessity and proportionality
The Board finds that Meta’s policies, as framed and enforced, capture more content than necessary. Neither policy is proportionate to the issues it is trying to address.
a. Sexual Solicitation
The Sexual Solicitation policy’s definitions of an implicit "offer or ask" and sexually suggestive poses are overbroad and bound to capture a significant amount of content unrelated to sexual solicitation. Meta itself acknowledges the risk of erroneous enforcement, stating it is “aware that some content reviewers may incorrectly remove content as implicit sexual solicitation (even though it is not) based on an overly-technical application of [its] internal reviewer guidance.” Meta continued:
Currently, based on our Known Questions, we consider sharing, mentioning, or providing contact information of social or digital identities to be an implicit offer or ask for sexual solicitation. […] However, applying this guidance can lead to over-enforcement in cases where, for instance, a model is perceived by a reviewer as posing in a sexually suggestive way (meets the “sexually suggestive element” criterion) and tags the photographer to give them credit for the picture (meets the “offer or ask” criterion). This type of content is non-violating because there is no intention to solicit sex, but it may still be removed (contrary to the policy) because it otherwise meets the two criteria outlined above.
UNESCO, in a report discussing education in the digital space, described the risk of mistaken over-enforcement, noting that “strict regulations concerning the sharing of explicit images means that, in some cases, educational materials published online to support learning about the body, or sexual relationships, may be mistaken by moderators for inappropriate, explicit content and therefore removed from generic web platforms.” The Board also notes the many public comments it received discussing erroneous removals under this Standard. For example, ACON (an HIV education NGO in Australia) writes that content promoting HIV prevention messaging in a sex-positive way, as well as content promoting education workshops, has been removed for sexual solicitation. As a result, the NGO now chooses the language in its content to avoid removals by Meta, rather than the language best suited to reach its target communities (Public Comment 10550). This was echoed by Joanna Williams, a researcher who found that nine of the twelve sexual health organizations she interviewed reported being negatively affected by Meta’s moderation in this area (Public Comment 10613).
b. Adult Nudity and Sexual Activity
In addition to the challenges in establishing enforceable and scalable rules based on Meta’s perception of sex and gender identity, as described above, the Board also finds that Meta’s policies on adult nudity impose disproportionate restrictions on some types of content and expression. The policies mandate the removal of content when less restrictive measures could achieve the stated policy goals.
Meta already uses a diverse range of enforcement actions aside from removal, including applying warning screens and age-gating content so that only users over the age of 18 can view it. Further, it already employs such measures within its Adult Nudity and Sexual Activity policy, including for artistic depictions of sexual activity. Meta may also wish to engage automated and human moderators to make more refined, context-specific determinations of when nude content is actually sexual, regardless of the gender of the body it depicts. Meta could further employ a wide range of policy interventions to limit the visibility of nude content to users who do not wish to see it by enabling greater user control. Meta also has a number of dedicated policies on issues it is also addressing through the nudity policy (such as the Adult Sexual Exploitation policy and the Child Sexual Exploitation, Abuse and Nudity policy) that could be strengthened.
The Board notes that Meta’s enforcement practices reportedly result in a high number of false positives, that is, mistaken removals of non-violating content. Meta’s most recent Community Standards Enforcement Report for Instagram (April-June 2022) disclosed that 21% of the Adult Nudity and Sexual Activity removals that were appealed led to the content being restored. The Board also received a high number of public comments concerning the mistaken removal of content under the Adult Nudity and Sexual Activity policy.
Non-discrimination
There is evidence that Meta’s policies and enforcement relating to the Adult Nudity and Sexual Activity policy can lead to disproportionate impacts specifically for women and LGBTQI+ people. These impacts are reflected in both policy and enforcement and limit the ways in which groups can express themselves, resist prejudice and increase their visibility in society.
While these cases concern trans and non-binary users, the enforcement errors at issue stem from an underlying policy that also impacts women, especially as Meta adopts a “default to female” approach for nude content. This section therefore considers how Meta’s policies impact both LGBTQI+ people and women. The large volume of public submissions in these cases provided many illustrations of the impact these policies can have.
The right to freedom of expression is guaranteed to all people without discrimination as to “sex” or “other status” (Article 2, para. 1, ICCPR). This includes sexual orientation and gender identity (Toonen v. Australia (1994); A/HRC/19/41, para. 7). The Human Rights Committee’s jurisprudence notes that “not every differentiation based on the grounds listed in article 26 of the Covenant amounts to discrimination, as long as it is based on reasonable and objective criteria and in pursuit of an aim that is legitimate under the Covenant” (Nepomnyashchiy v Russia, Human Rights Committee, 2018, para. 7.5 (CCPR/C/123/D/2318/2013)).
The Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW) prohibits “any distinction, exclusion or restriction made on the basis of sex which has the effect or purpose of impairing or nullifying the recognition, enjoyment or exercise by women […] on a basis of equality of men and women, of human rights and fundamental freedoms in the political, economic, social, cultural, civil or any other field” (CEDAW, Art. 1). The Board notes that international human rights bodies have not addressed the human rights implications of either permitting or prohibiting consensual adult nudity and its potential discriminatory impacts.
The UN Guiding Principles state that “business enterprises should pay special attention to any particular human rights impacts on individuals from groups or populations that may be at heightened risk of vulnerability or marginalization” (UNGPs, Principles 18 and 20). The Special Rapporteur on freedom of expression has urged tech companies to “actively seek and take into account the concerns of communities historically at risk of censorship and discrimination” (A/HRC/38/35, para. 48). The United Nations Working Group on Business and Human Rights has also recommended that technology companies ensure that “artificial intelligence and automation do not have disproportionate adverse impacts on women’s human rights” (Gender Dimensions Handbook).
Given the importance of social media platforms as an arena for expression for individuals subject to discrimination, the Board has consistently articulated its expectation that Meta be particularly sensitive to the possibility of wrongful removal of content by, about or depicting members of these groups. As the Board noted in the “Wampum belt” decision (2021-012-FB-UA) regarding artistic expression from Indigenous persons, it is not sufficient to evaluate the performance of Meta’s enforcement of Facebook’s Hate Speech policy on the user population as a whole; effects on specific groups must be taken into account. Similarly, in the “Reclaiming Arabic words” case, the Board confirmed that “the over-moderation of speech by users from persecuted minority groups is a serious threat to their freedom of expression” and expressed concern about how exemptions in Meta’s policies (in that case, the Hate Speech policy) were applied to expression from marginalized groups (2022-003-IG-UA).
Meta’s choices result in disparate opportunities for expression being made available to women, trans, and gender non-binary people on its platforms. Meta’s current Adult Nudity and Sexual Activity policy treats female breasts and nipples as inherently sexual, and thus subject to prohibition, unless they have been or will be operated on surgically or are depicted in the act of breastfeeding. Instead of taking steps to ensure that censorship does not disproportionately impact some groups, Meta’s policy entrenches and perpetuates such impacts on these groups.
These cases highlight the disproportionate impact of Meta’s policy choices for people who identify as LGBTQI+, as content was identified multiple times by Adult Nudity and Sexual Activity classifiers despite falling outside the scope of the policy. The Board believes that these cases are emblematic of broader problems. For example, the Haimson et al. study found that transgender people report high levels of content being removed and accounts being deleted, typically due to nudity and sexual content.
The enforcement of Meta’s policy choices also has a disproportionate impact on women. A study by Witt, Suzor and Huggins found that up to 22% of the images of women’s bodies removed from Instagram were apparent false positives. The differential impact on women’s bodies was also noted in public comments (see, e.g., Public Comment 10616 by Dr. Zahra Stardust).
This default position on nudity also has a severe impact in contexts where women may traditionally go bare-chested. The Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression has urged that companies engage with indigenous groups around the world to “develop better indicators for taking into account cultural and artistic context when assessing content featuring nudity” (A/HRC/38/35, para. 54).
In addition to these policy-related non-discrimination concerns, these cases also raised enforcement-related non-discrimination concerns. Meta should be mindful that content from users who identify with marginalized groups is at greater risk of repeated or malicious reporting, where users report non-violating content in order to burden or harass its authors. These issues were also raised in several public comments (see, e.g., Public Comment 10596 by GLAAD and Public Comment 10628 by The Human Rights Campaign Foundation).
These cases highlighted that multiple reports generating multiple reviews can increase the likelihood of mistaken removals. Indeed, in these cases, most of the user reports resulted in human reviews that found the content to be non-violating, but the content continued to be reported until reviewers mistakenly determined it to be violating and removed it.
Meta should seek to develop and implement policies that help ameliorate all these concerns. These could include more uniform policies on nudity that apply without discrimination on the basis of sex or gender identity. They might also include more contextualized determinations of what content is sexual, as long as such determinations avoid reliance on discriminatory criteria.
The Board notes that Meta has a dedicated policy to address non-consensual intimate imagery in its Adult Sexual Exploitation policy. This has been an area of prioritized enforcement by the company (see, for example, its introduction of automated detection technology to stop non-consensual images being repeatedly posted). When Meta considers changing its approach to managing nudity on the platform, it should closely examine the degree to which the Adult Nudity and Sexual Activity policy protects against the sharing of non-consensual imagery, and assess whether changes to the Adult Sexual Exploitation policy or its enforcement may be needed to strengthen its efficacy.
The Board also recognizes that Meta may have a legitimate interest in limiting sexual or pornographic content on its platform. But the Board believes that relevant business objectives can and should be met with approaches that treat all users without discrimination.
Some Board Members believe that Meta should seek to reduce the discriminatory impact of its current policies by adopting an adult nudity policy that is not based on differences of sex or gender. They noted the provisions of the Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW) on eliminating gendered stereotypes (see, for example, Articles 5 and 10) and Meta’s explicit commitment to CEDAW in its Corporate Human Rights Policy. They concluded that, in the context of the nudity policy, these international human rights standards and Meta’s own commitment to non-discrimination support eliminating stereotyped distinctions. This group of Members has concluded that shifting towards a policy that is not based on sex or gender differences would be the best way for Meta to uphold its human rights responsibilities as a business, given its corporate values and commitments. These Members note that norms and policies should evolve to address conventions that have discriminatory impacts, as many forms of discrimination were, or continue to be, widespread social conventions.
There was some disagreement among Board Members on these issues. Some Members agreed in principle that Meta should not rely on sex or gender to limit expression but were deeply skeptical of Meta’s capacity to effectively address non-consensual intimate imagery and other potential harms without a sex- and gender-conscious nudity policy. Other Members of the Board believe that because applicable human rights principles on non-discrimination allow for distinctions on the grounds of protected characteristics so long as they are “based on reasonable and objective criteria and in pursuit of an aim that is legitimate under the Covenant” (Nepomnyashchiy v Russia, Human Rights Committee, 2018, para. 7.5 (CCPR/C/123/D/2318/2013)), a sex- and gender-neutral nudity policy is not required and could cause or exacerbate other harms.
The Board Members who support a sex and gender-neutral adult nudity policy recognize that under international human rights standards as applied to states, distinctions on the grounds of protected characteristics may be made based on reasonable and objective criteria and when they serve a legitimate purpose. They do not believe that the distinctions within Meta’s nudity policy meet that standard. They further note that, as a business, Meta has made human rights commitments that are inconsistent with an approach that restricts online expression based on the company’s perception of sex and gender.
Viewed comprehensively, given the confusion around the rules and their enforcement, and the disproportionate and discriminatory impact of Meta’s current Adult Nudity and Sexual Activity policy, the Board recommends that Meta define clear, objective, rights-respecting criteria to govern the entirety of that policy, ensuring treatment of all people that is consistent with international human rights standards, including without discrimination on the basis of sex or gender identity. Meta should first conduct a comprehensive human rights impact assessment to review the implications of adopting such criteria, including broadly inclusive stakeholder engagement across diverse ideological, geographic and cultural contexts. To the degree that this assessment identifies any potential harms, implementation of the new policy should include a mitigation plan for addressing them. The Board requests a report on the assessment and plan six months from the date of issue of this decision.
9. Oversight Board decision
The Oversight Board overturns Meta's original decisions to remove both posts, requiring them to be restored.
10. Policy advisory statement
Content policy
1. In order to treat all users fairly and provide moderators and the public with a workable standard on nudity, Meta should define clear, objective, rights-respecting criteria to govern the entirety of its Adult Nudity and Sexual Activity policy, ensuring treatment of all people that is consistent with international human rights standards, including without discrimination on the basis of sex or gender identity. Meta should first conduct a comprehensive human rights impact assessment to review the implications of adopting such criteria, including broadly inclusive stakeholder engagement across diverse ideological, geographic and cultural contexts. To the degree that this assessment identifies any potential harms, implementation of the new policy should include a mitigation plan for addressing them.
2. In order to provide greater clarity to users, Meta should provide users with more explanation of what constitutes an "offer or ask" for sex (including links to third party websites) and what constitute sexually suggestive poses in the public Community Standards. The Board will consider this recommendation implemented when an explanation of these terms with examples is added to the Sexual Solicitation Community Standard.
Enforcement
3. In order to ensure that Meta’s internal criteria for its Sexual Solicitation policy do not result in the removal of more content than the public-facing policy indicates and so that non-sexual content is not mistakenly removed, Meta should revise its internal reviewer guidance to ensure that the criteria reflect the public-facing rules and require a clearer connection between the "offer or ask" and the "sexually suggestive element." The Board will consider this implemented when Meta provides the Board with its updated internal guidelines that reflect these revised criteria.
*Procedural note:
The Oversight Board’s decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.
For this case decision, independent research was commissioned on behalf of the Board. An independent research institute headquartered at the University of Gothenburg and drawing on a team of over 50 social scientists on six continents, as well as more than 3,200 country experts from around the world, provided expertise on socio-political and cultural context. The Board was also assisted by Duco Advisors, an advisory firm focusing on the intersection of geopolitics, trust and safety, and technology.