Announcing the Board’s Next Cases
July 13, 2021
Today the Board is announcing two new cases for consideration.
Case Selection
As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse, or raise important questions about Facebook's policies.
The cases we are announcing today are:
2021-010-FB-UA
User appeal to restore content to Facebook
Submit public comment here.
Note: To allow people to provide comments on the nature and impact of the post and help people understand the Board’s eventual ruling in this case, we are sharing some of the exact words used in this post. We do so in the interest of transparency while recognizing that some of the quoted language has the potential to offend.
In June 2021, the Facebook page of a regional news outlet in Colombia shared a post by a verified Facebook page. The post contains a short video (originally shared on TikTok) with text expressing admiration for those in the video. The video shows a protest in Colombia, with people marching behind a banner that says “SOS COLOMBIA.” The protesters are singing and addressing the Colombian President, mentioning tax reform and the strike. As part of their chant, the protesters call the President an “hijo de puta” and say “deja de hacerte el marica en la tv.” Facebook translated these phrases as “son of a bitch” and “stop being the fag on tv.”
The content was viewed around 19,000 times and shared over 70 times. Fewer than five users reported the content. Facebook removed the share of the post under its Hate Speech policy. Under its Hate Speech Community Standard, Facebook takes down content that “describes or negatively targets people with slurs, where slurs are defined as words that are inherently offensive and used as insulting labels” on the basis of protected characteristics including sexual orientation. The word “m**ica” is on Facebook’s list of prohibited slur words.
The user submitted their appeal to the Board in Spanish. In the appeal, the page administrator states that they are a journalist reporting on local news from their province and that they aim to follow Facebook’s policies. They also note that the removal led to account penalties. The user states that the video was not intended to cause harm and that it shows young people protesting within the framework of freedom of expression and peaceful protest. They note that the young people are expressing themselves without violence and demanding rights using typical language, and they express concern about government repression of protest.
The Board would appreciate public comments that address:
- Whether Facebook’s decision to remove the post is consistent with the company’s Hate Speech Community Standard, specifically the rules against describing or negatively targeting people with slurs.
- Whether Facebook’s decision to remove the post is consistent with the company’s stated values and human rights responsibilities and commitments.
- The different usages and impacts of the word “m**ica” in Colombia, including in the context of criticism of political figures.
- Insights on the socio-political context in Colombia, including about the restriction of information on social media regarding the recent protests and criticism of political figures.
- How Facebook’s Spanish-language moderation differs across Spanish-speaking countries.
- The availability of the newsworthiness allowance to local and regional news outlets.
- Whether sufficient detail is currently provided to people who use Facebook in Spanish whose content is removed for violating the Hate Speech policy.
In its decisions, the Board can issue policy recommendations to Facebook. While recommendations are not binding, Facebook must respond to them within 30 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.
2021-011-FB-UA
User appeal to restore content to Facebook
Submit public comment here.
Note: To allow people to provide comments on the nature and impact of the post and help people understand the Board’s eventual ruling in this case, we are sharing some of the exact words used in this post. We do so in the interest of transparency while recognizing that some of the quoted language has the potential to offend.
In May 2021, a Facebook user who appears to be in South Africa posted in English in a public group described as focused on unlocking minds. The post discusses “multi-racialism” in South Africa, and states that poverty, homelessness, and landlessness have increased for black people in South Africa since 1994. It also states that white people hold and control the majority of wealth, and that wealthy black people may have ownership of some companies, but not control. The post then states that if “you think” sharing neighborhoods, language, and schools with white people makes you “deputy-white” then “you need to have your head examined.” The post concludes with “[y]ou are” a “sophisticated slave,” “a clever black,” “’n goeie kaffir” or “House nigger.”
The post received over 1,000 views and was shared over 40 times. The user’s Facebook profile picture and banner photo depict black people (the Board is not able to verify the identity or protected characteristics of users who appeal or report content).
Facebook removed the post under its Hate Speech policy the same day it was posted after it was reported by a user who appears to be located in South Africa. Under its Hate Speech Community Standard, Facebook takes down content that “describes or negatively targets people with slurs, where slurs are defined as words that are inherently offensive and used as insulting labels” on the basis of their race, ethnicity and/or national origin. Facebook also prohibits targeting people based on protected characteristics with generalizations about mental deficiencies or statements of inferiority. The Community Standard includes an exception to allow people to “share content that includes someone else's hate speech to condemn it or raise awareness” and to take into account that “speech that might otherwise violate our standards can be used self-referentially or in an empowering way.”
The user submitted their appeal to the Board in English. The user stated in their appeal that they want to understand why the post was removed. They noted that people should be allowed to share different views on the platform and “engage in a civil and healthy debate.” The user also stated that they “did not write about any group to be targeted for hatred or for its members to be ill-treated in any way by members of a different group.” They argued that their post instead “encouraged members of a certain group to do introspection and re-evaluate their priorities and attitudes.” The user also stated that there is nothing in the post or “in its spirit or intent” that would promote hate speech, and that it is unfortunate that Facebook is unable to tell them what part of their post is hate speech.
The Board would appreciate public comments that address:
- Whether Facebook’s decision to remove the post is consistent with the company’s Hate Speech Community Standard, specifically the rules against describing or negatively targeting people with slurs and generalizations about mental deficiencies or statements of inferiority.
- Whether Facebook’s decision to remove the post is consistent with the company’s stated values and human rights responsibilities and commitments.
- The usage and impact of the words included in this post in the South African context, including in discussions relating to the political, economic and social issues raised by the user.
- Content moderation challenges specific to South Africa, both in terms of respecting freedom of expression and addressing harms that may result from hate speech online.
- How Facebook should interpret the following concepts when enforcing its Hate Speech Community Standard: ‘self-referential,’ ‘empowering,’ ‘condemning,’ and ‘awareness raising.’
- Whether sufficient detail is currently provided to people who use Facebook in English whose content is removed for violating the Hate Speech policy.
- What information about users, including on protected characteristics, should be available to moderators when reviewing content, considering its possible relevance to enforcing the Community Standard on Hate Speech. The Board would also appreciate comments on whether Facebook can confirm user-provided information, and any privacy concerns these points might raise.
In its decisions, the Board can issue policy recommendations to Facebook. While recommendations are not binding, Facebook must respond to them within 30 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.
Public Comments
If you or your organization feel that you can contribute valuable perspectives that can help with reaching a decision on the cases announced today, you can submit your contributions using the links above. The public comment window for these cases is open for 14 days, closing at 15:00 UTC on Tuesday, July 27.
What’s Next
In the coming weeks, Board Members will be deliberating these cases. Once they have reached their final decisions, we will post them on the Oversight Board website. To receive updates when the Board announces new cases or publishes decisions, sign up here.