Oversight Board Overturns Original Facebook Decision in Breast Cancer Symptoms and Nudity Case
January 28, 2021
The Oversight Board has overturned Facebook’s decision to remove a post on Instagram. After the Board selected this case, Facebook restored the content. Facebook’s automated systems originally removed the post for violating the company’s Community Standard on Adult Nudity and Sexual Activity. The Board found that the post was allowed under a policy exception for “breast cancer awareness” and that Facebook’s automated moderation in this case raises important human rights concerns.
About the case
In October 2020, a user in Brazil posted a picture to Instagram with a title in Portuguese indicating that it was to raise awareness of signs of breast cancer. The image was pink, in line with “Pink October,” an international campaign to raise awareness of this disease. Eight photographs within the picture showed breast cancer symptoms with corresponding descriptions. Five of them included visible and uncovered female nipples, while the remaining three photographs included female breasts, with the nipples either out of shot or covered by a hand. The post was removed by an automated system enforcing Facebook’s Community Standard on Adult Nudity and Sexual Activity. After the Board selected the case, Facebook determined this was an error and restored the post.
Key findings
In its response, Facebook claimed that the Board should decline to hear this case. The company argued that, having restored the post, there was no longer disagreement between the user and Facebook that the content should stay up, making this case moot.
The Board rejects Facebook’s argument. The need for disagreement applies only at the moment the user exhausts Facebook’s internal appeal process. As the user and Facebook disagreed at that time, the Board can hear the case.
Facebook’s decision to restore the content also does not make this case moot, as the company claims. In addition to making binding decisions on whether to restore pieces of content, the Board also offers users a full explanation of why their post was removed.
The incorrect removal of this post indicates a lack of proper human oversight, which raises human rights concerns. The detection and removal of this post was entirely automated. Facebook’s automated systems failed to recognize the words “Breast Cancer,” which appeared on the image in Portuguese, and the post was removed in error. As Facebook’s rules treat male and female nipples differently, using inaccurate automation to enforce these rules disproportionately affects women’s freedom of expression. Enforcement which relies solely on automation without adequate human oversight also interferes with freedom of expression.
In this case, the user was told that the post violated Instagram’s Community Guidelines, implying that sharing photos of uncovered female nipples to raise breast cancer awareness is not allowed. However, Facebook’s Community Standard on Adult Nudity and Sexual Activity expressly allows nudity when the user seeks to “raise awareness about a cause or educational or medical reasons” and specifically permits uncovered female nipples to advance “breast cancer awareness.” As Facebook’s Community Standards apply to Instagram, the user’s post is covered by this exception. Hence, Facebook’s removal of the content was inconsistent with its own Community Standards.
The Oversight Board’s decision
The Oversight Board overturns Facebook’s original decision to remove the content and requires that the post be restored. The Board notes that Facebook has already taken action to this effect.
The Board recommends that Facebook:
- Inform users when automated enforcement is used to moderate their content, ensure that users can appeal automated decisions to a human being in certain cases, and improve automated detection of images with text-overlay so that posts raising awareness of breast cancer symptoms are not wrongly flagged for review. Facebook should also improve its transparency reporting on its use of automated enforcement.
- Revise Instagram’s Community Guidelines to specify that female nipples can be shown to raise breast cancer awareness and clarify that where there are inconsistencies between Instagram’s Community Guidelines and Facebook’s Community Standards, the latter take precedence.
For further information:
To read the full case decision, click here.
To read a synopsis of public comments for this case, click here.