New Decision: Errors in Meta’s Enforcement Against Viral Video From Nigeria Put Men Alleged to be Gay at Risk
October 15, 2024
The Oversight Board is seriously concerned about Meta’s failure to take down a video showing two bleeding men who appear to have been beaten for allegedly being gay. The content was posted in Nigeria, which criminalizes same-sex relationships. In overturning the company’s original decision, the Board notes that by leaving the video on Facebook for five months, Meta exposed the men’s identities and put them at risk of immediate harm, given the hostile environment for LGBTQIA+ people in Nigeria. Such damage is immediate and impossible to undo. The content, which shared and mocked violence and discrimination, violated four different Community Standards, was reported multiple times and was reviewed by three human moderators. This case reveals systemic failings in enforcement. The Board’s recommendations include a call for Meta to assess how it enforces the relevant rule under the Coordinating Harm and Promoting Crime Community Standard. They also address the failings likely to have arisen from Meta misidentifying the language spoken in the video, and how the company handles languages it does not support for at-scale content review.
About the Case
A Facebook user in Nigeria posted a video that shows two bleeding men who appear to have been tied up and beaten. People around the frightened men question them in Igbo, one of Nigeria’s major languages. One of the men gives his name and explains, seemingly under coercion, that he was beaten for having sex with another man. The user who posted the content added an English caption mocking the men, stating they were caught having sex and that this is “funny” because they are married.
The video was viewed more than 3.6 million times. Between December 2023, when it was posted, and February 2024, 92 users reported the content, the majority for violence and incitement or hate speech. Two human reviewers decided it did not violate any of the Community Standards and should therefore remain on Facebook. One user appealed to Meta but, after another human review, the company again decided there were no violations. The user then appealed to the Board. After the Board brought the case to Meta’s attention, the company removed the post under its Coordinating Harm and Promoting Crime policy.
Nigeria criminalizes same-sex relationships, with LGBTQIA+ people facing discrimination and severe restrictions on their human rights.
Key Findings
The Board finds the content violated four separate Community Standards, including the Coordinating Harm and Promoting Crime rule that prohibits identifying individuals alleged to be members of an outing-risk group. The man’s admission in the video that he had sex with another man was coerced, while the caption explicitly alleges the men are gay. The content also broke the rules on hate speech, bullying and harassment, and violent and graphic content.
There are two rules on outing under the Coordinating Harm and Promoting Crime policy. The first is relevant here and is applied at scale. It prohibits “outing: exposing the identity or locations affiliated with anyone who is alleged to be a member of an outing-risk group.” There is a similar rule that is applied only when content is escalated to Meta’s experts. The Board is concerned that Meta does not adequately explain the differences between the two outing rules, and that the rule applied at scale does not publicly state that “outing” includes identifying people as LGBTQIA+ in countries where there is a higher risk of offline harm, such as Nigeria. Currently, this information is available only in internal guidance. This ambiguity could confuse users, preventing them from complying with the rules, and make it harder for people targeted by such abusive content to get these posts removed. Meta needs to update its public rule and provide examples of outing-risk groups.
This content was left up for about five months, despite breaking four different rules and featuring violence and discrimination. Human moderators reviewed the content and failed to identify that it broke the rules. The longer the video stayed up, the greater the likelihood that someone would identify the men and that the post would encourage users to harm other LGBTQIA+ people in Nigeria. The video was eventually taken down, but by then it had gone viral. Even after it was removed, the Board’s research shows that sequences of the same video remained on Facebook.
When the Board asked Meta about its enforcement actions, the company admitted two errors. First, its automated language-detection systems identified the content as English before sending it for human review. Second, Meta’s teams then misidentified the language spoken in the video as Swahili. The correct language is Igbo, spoken by millions in Nigeria, but it is not supported by Meta for content moderation at scale. When a language is not supported, as in this case, content is sent instead to human reviewers who work across multiple languages and rely on translations provided by Meta’s technologies. This raises concerns about how content in unsupported languages is treated, the choice of languages the company supports for at-scale review, and the accuracy of the translations provided to reviewers working across multiple languages.
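To make the routing described above concrete, the following is a minimal, hypothetical sketch of such a pipeline. All names here (`route_for_review`, `SUPPORTED_LANGUAGES`, `machine_translate`) are illustrative assumptions; Meta’s actual systems are not public.

```python
# Hypothetical sketch of the review-routing logic described in this case.
# Names, queues and the supported-language set are illustrative only.

from dataclasses import dataclass

# Languages supported for at-scale review (illustrative set; note that
# Igbo, "ig", is absent, as in this case).
SUPPORTED_LANGUAGES = {"en", "sw", "ar"}


@dataclass
class Post:
    text: str
    detected_language: str  # output of an automated classifier; may be wrong


def machine_translate(text: str, target: str = "en") -> str:
    """Stand-in for the machine-translation step described in the decision."""
    return f"[{target} translation of] {text}"


def route_for_review(post: Post) -> str:
    """Route a reported post to a review queue based on its detected language.

    The failure mode in this case: if the classifier mislabels the language
    (Igbo detected as English, then Swahili), the post lands in the wrong
    queue and reviewers never receive a usable translation.
    """
    if post.detected_language in SUPPORTED_LANGUAGES:
        # Reviewed by moderators who work in that language.
        return f"queue:{post.detected_language}"
    # Unsupported language: language-agnostic reviewers work from a
    # machine translation supplied alongside the post.
    machine_translate(post.text)
    return "queue:language-agnostic"


if __name__ == "__main__":
    # Misdetected as English, so it never reaches language-agnostic review.
    post = Post(text="<Igbo-language audio and caption>", detected_language="en")
    print(route_for_review(post))  # -> queue:en
```

The Board’s recommendations below would also give reviewers the option to re-route content containing an unsupported language to language-agnostic review, rather than relying solely on the automated classifier.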
The Oversight Board’s Decision
The Oversight Board overturns Meta’s original decision to leave up the content.
The Board recommends that Meta:
- Update the Coordinating Harm and Promoting Crime Community Standard’s at-scale prohibition on “outing” to include illustrative examples of “outing-risk groups,” including LGBTQIA+ people in countries where same-sex relationships are criminalized and/or such disclosures create significant safety risks.
- Conduct an assessment of the enforcement accuracy of the at-scale prohibition on exposing the identity or locations of anyone alleged to be a member of an outing-risk group, under the Coordinating Harm and Promoting Crime Community Standard.
- Ensure its language-detection systems accurately identify content in unsupported languages and provide accurate translations of such content to language-agnostic reviewers.
- Ensure that content containing an unsupported language, even when combined with supported languages, is routed to language-agnostic review. This includes giving reviewers the option to re-route content containing an unsupported language to language-agnostic review.
For Further Information
To read public comments for this case, click here.