Oversight Board Overturns Meta's Original Decisions in the "Gender Identity and Nudity" Cases
January 17, 2023
The Oversight Board has overturned Meta’s original decisions to remove two Instagram posts depicting transgender and non-binary people with bare chests. It also recommends that Meta change its Adult Nudity and Sexual Activity Community Standard so that it is governed by clear criteria that respect international human rights standards.
About the Cases
In this decision, the Oversight Board considers two cases together for the first time. Two separate pieces of content were posted by the same Instagram account, one in 2021, the other in 2022. The account is maintained by a US-based couple who identify as transgender and non-binary.
Both posts feature images of the couple bare-chested with the nipples covered. The image captions discuss transgender healthcare and say that one member of the couple will soon undergo top surgery (gender-affirming surgery to create a flatter chest), which the couple is fundraising to pay for.
Following a series of alerts by Meta’s automated systems and reports from users, the posts were reviewed multiple times for potential violations of various Community Standards. Meta ultimately removed both posts for violating the Sexual Solicitation Community Standard, seemingly because they contain breasts and a link to a fundraising page.
The users appealed to Meta and then to the Board. After the Board accepted the cases, Meta found it had removed the posts in error and restored them.
Key Findings
The Oversight Board finds that removing these posts is not in line with Meta’s Community Standards, values, or human rights responsibilities. These cases also highlight fundamental issues with Meta’s policies.
Meta’s internal guidance to moderators on when to remove content under the Sexual Solicitation policy is far broader than both the policy’s stated rationale and the publicly available guidance. This creates confusion for users and moderators and, as Meta has recognized, leads to content being wrongly removed.
In at least one of the cases, the post was sent for human review by an automated system trained to enforce the Adult Nudity and Sexual Activity Community Standard. This Standard prohibits images containing female nipples other than in specified circumstances, such as breastfeeding and gender confirmation surgery.
This policy is based on a binary view of gender and a distinction between male and female bodies. Such an approach makes it unclear how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale.
The restrictions and exceptions to the rules on female nipples are extensive and confusing, particularly as they apply to transgender and non-binary people. Exceptions to the policy range from protests, to scenes of childbirth, and medical and health contexts, including top surgery and breast cancer awareness. These exceptions are often convoluted and poorly defined. In some contexts, for example, moderators must assess the extent and nature of visible scarring to determine whether certain exceptions apply. The lack of clarity inherent in this policy creates uncertainty for users and reviewers, and makes it unworkable in practice.
The Board has consistently said Meta must be sensitive to how its policies impact people subject to discrimination (see, for example, the “Wampum belt” and “Reclaiming Arabic words” decisions). Here, the Board finds that Meta’s policies on adult nudity result in greater barriers to expression for women, trans, and gender non-binary people on its platforms. For example, they have a severe impact in contexts where women may traditionally go bare-chested, and people who identify as LGBTQI+ can be disproportionately affected, as these cases show. Meta’s automated systems flagged the content multiple times, even though it did not violate Meta’s policies.
Meta should seek to develop and implement policies that address all these concerns. It should change its approach to managing nudity on its platforms by defining clear criteria to govern the Adult Nudity and Sexual Activity policy that ensure all users are treated in a manner consistent with human rights standards. It should also examine whether the Adult Nudity and Sexual Activity policy protects against non-consensual image sharing, and whether other policies need to be strengthened in this regard.
The Oversight Board's Decision
The Oversight Board overturns Meta's original decisions to remove the posts.
The Board also recommends that Meta:
- Define clear, objective, rights-respecting criteria to govern its Adult Nudity and Sexual Activity Community Standard, so that all people are treated in a manner consistent with international human rights standards, without discrimination on the basis of sex or gender. Meta should first conduct a comprehensive human rights impact assessment on such a change, engaging diverse stakeholders, and create a plan to address any harms identified.
- Provide more detail in its public-facing Sexual Solicitation Community Standard on the criteria that lead to content being removed.
- Revise its guidance for moderators on the Sexual Solicitation Community Standard so that it more accurately reflects the public rules on the policy. This would help to reduce enforcement errors on Meta’s part.
For Further Information
To read the full decision, please click on the attachment below.
To read a synopsis of public comments for these cases, please click here.