Oversight Board Overturns Meta's Decision in Call for Women’s Protest in Cuba Case
October 3, 2023
The Oversight Board has overturned Meta's decision to remove a video posted on Instagram by a Cuban news platform in which a woman protests against the Cuban government, calls on other women to join her on the streets and criticizes men for failing to defend those who have been repressed, comparing them to animals culturally perceived as inferior. The Board finds the speech in the video to be a qualified behavioral statement that should be allowed under Meta's Hate Speech Community Standard. Furthermore, in countries where there are strong restrictions on people's rights to freedom of expression and peaceful assembly, it is critical that social media protect users' voices, especially in times of political protest.
About the Case
In July 2022, a news platform, which describes itself as critical of the Cuban government, posted a video on its verified Instagram account. The video shows a woman calling on other women to join her on the streets to protest against the government. At a certain point, she describes Cuban men as "rats" and "mares" carrying urinal pots, because they cannot be counted on to defend people being repressed by the government. A caption in Spanish accompanying the video includes hashtags that refer to the "dictatorship" and "regime" in Cuba, and it calls for international attention to the situation in the country using #SOSCuba.
The video was shared around the first anniversary of the nationwide protests that had taken place in July 2021 when Cubans took to the streets, in massive numbers, for their rights. State repression increased in response, continuing into 2022. The timing of the post was also significant because it was shared days after a young Cuban man was killed in an incident involving the police. The woman in the video appears to reference this when she mentions that “we cannot keep allowing the killing of our sons.” Text overlaying the video connects political change to women’s protests.
The video was played more than 90,000 times and shared fewer than 1,000 times.
Seven days after it was posted, a hostile speech classifier identified the content as potentially violating and sent it for human review. Although a human moderator found that the post violated Meta's Hate Speech policy, the content remained online while it went through additional rounds of human review under the cross-check system. Because of a seven-month gap between these rounds, the post was not removed until February 2023. On the same day, the user who shared the video appealed Meta's decision. Meta upheld its decision without escalating the content to its policy or subject matter experts. A standard strike was applied to the Instagram account, but no feature limit.
Key Findings
The Board finds that, when read as a whole, the post does not intend to dehumanize men based on their sex, incite violence against them or exclude them from conversations about the Cuban protests. The post unambiguously aims to call attention to the woman's opinion about the behavior of Cuban men in the context of the historic demonstrations that began in July 2021. Regional experts and public comments point to the post as a call to action directed at Cuban men, with the woman using language such as "rats" and "mares" to imply cowardice in that precise context and to express her personal frustration at their behavior.
If taken out of context and given an overly literal reading, the comparison of men to animals culturally perceived as inferior could be seen as violating Meta's Hate Speech policy. However, the post, when taken as a whole, is not a generalization that aims to dehumanize men, but rather a qualified behavioral statement, which is allowed under the policy. Consequently, the Board finds that the removal of the content is inconsistent with Meta's Hate Speech policy.
Furthermore, external experts flagged the hashtag #SOSCuba, which the user posted to draw attention to the economic, political and humanitarian crises facing Cubans, establishing the protests as an important point of historical reference. The Board is concerned about how contextual information is factored into Meta's decisions on content that does benefit from additional human review. In this case, even though the content underwent escalated review, a process that is supposed to deliver better results, Meta still failed to get it right.
Meta should ensure that both its automated systems and content reviewers are able to factor contextual information into their decision-making process.
In this case, it was particularly important to protect the content. Cuba is characterized by closed civic spaces, the risks associated with dissent are high, and access to the internet is very restricted. Relevant context may not have been sufficiently considered as part of the escalation process. Meta should consider how context influences its policies and the way in which they are enforced.
The Oversight Board’s Decision
The Oversight Board overturns Meta’s decision to remove the post.
While the Board makes no new recommendations in this case, it reiterates relevant ones from previous decisions, for Meta to follow closely:
- Have a list-based over-enforcement prevention program to protect expression in line with Meta's human rights responsibilities, which should be distinct from the one that protects expression viewed by Meta as a business priority (recommendation no. 1 from the cross-check policy advisory opinion). This separate system should also ensure Meta provides additional layers of review to content posted by, among others, human rights defenders.
- Use specialized staff, with the benefit of local input, to create over-enforcement prevention lists (recommendation no. 8 from the cross-check policy advisory opinion).
- Improve how its workflow dedicated to meeting its human rights responsibilities incorporates context and language expertise in enhanced review, specifically at decision-making levels (recommendation no. 3 from the cross-check policy advisory opinion).
- To ensure context is appropriately factored into content moderation, update the guidance for its at-scale moderators, with specific attention to the rules on qualification, since the current guidance makes it virtually impossible for moderators to make correct decisions (recommendation no. 2 from the Violence against women decision).
For Further Information
To read the full decision, click here.
To read a synopsis of public comments for this case, please click here.