Upheld
Depiction of Zwarte Piet
The Oversight Board has upheld Facebook's decision to remove specific content that violated the express prohibition on posting caricatures of Black people in the form of blackface, contained in its Hate Speech Community Standard.
Case summary
About the case
On December 5, 2020, a Facebook user in the Netherlands shared a post including text in Dutch and a 17-second-long video on their timeline. The video showed a young child meeting three adults, one dressed to portray “Sinterklaas” and two portraying “Zwarte Piet,” also referred to as “Black Pete.”
The two adults portraying Zwarte Piets had their faces painted black and wore Afro wigs under hats and colorful renaissance-style clothes. All the people in the video appear to be white, including those with their faces painted black. In the video, festive music plays and one Zwarte Piet says to the child, “[l]ook here, and I found your hat. Do you want to put it on? You’ll be looking like an actual Pete!”
Facebook removed the post for violating its Hate Speech Community Standard.
Key findings
While Zwarte Piet represents a cultural tradition shared by many Dutch people without apparent racist intent, it includes the use of blackface, which is widely recognized as a harmful racial stereotype.
Since August 2020, Facebook has explicitly prohibited caricatures of Black people in the form of blackface as part of its Hate Speech Community Standard. As such, the Board found that Facebook made it sufficiently clear to users that content featuring blackface would be removed unless shared to condemn the practice or raise awareness.
A majority of the Board saw sufficient evidence of harm to justify removing the content. They argued the content included caricatures that are inextricably linked to negative and racist stereotypes, and are considered by parts of Dutch society to sustain systemic racism in the Netherlands. They took note of documented cases of Black people experiencing racial discrimination and violence in the Netherlands linked to Zwarte Piet. These included reports that during the Sinterklaas festival Black children felt scared and unsafe in their homes and were afraid to go to school.
A majority found that allowing such posts to accumulate on Facebook would help create a discriminatory environment for Black people that would be degrading and harassing. They believed that the impacts of blackface justified Facebook’s policy and that removing the content was consistent with the company’s human rights responsibilities.
A minority of the Board, however, saw insufficient evidence to directly link this piece of content to the harm supposedly being reduced by removing it. They noted that Facebook’s value of “Voice” specifically protects disagreeable content and that, while blackface is offensive, depictions on Facebook will not always cause harm to others. They also argued that restricting expression based on cumulative harm can be hard to distinguish from attempts to protect people from subjective feelings of offense.
The Board found that removing content without providing an adequate explanation could be perceived as unfair by the user. In this regard, it noted that the user was not told that their content was specifically removed under Facebook’s blackface policy.
The Oversight Board’s decision
The Oversight Board upholds Facebook’s decision to remove the content.
In a policy advisory statement, the Board recommends that Facebook:
- Link the rule in the Hate Speech Community Standard prohibiting blackface to its reasoning for the rule, including the harms the company seeks to prevent.
- Ensure that users are always notified of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing, in line with the Board’s recommendation in case 2020-003-FB-UA. Where Facebook removes content for violating its rule on blackface, any notice to users should refer to this specific rule, and link to resources that explain the harm this rule seeks to prevent. Facebook should also provide a detailed update on its “feasibility assessment” of the Board’s prior recommendations on this topic.
*Case summaries provide an overview of the case and do not have precedential value.
Full case decision
1. Decision summary
The Oversight Board has upheld Facebook’s decision to remove specific content that violated the express prohibition on posting caricatures of Black people in the form of blackface, contained in its Hate Speech Community Standard. A majority of the Board found that removing the content complied with Facebook’s Community Standards, its values and its international human rights responsibilities.
2. Case Description
On December 5, 2020, a Facebook user in the Netherlands shared a post including text in Dutch and a 17-second-long video on their timeline. The caption of the post, as translated into English, states “happy child!” and thanks Sinterklaas and Zwarte Piets. The video showed a young child meeting three adults, one dressed to portray “Sinterklaas” and two portraying “Zwarte Piet,” also referred to as “Black Pete.” The two adults portraying Zwarte Piets had their faces painted black and wore Afro wigs under hats and colorful renaissance-style clothes. All the adults and the child in the video appear to be white, including those with their faces painted black.
In the video, festive music plays in the background as the child shakes hands with Sinterklaas and one Zwarte Piet. The other Zwarte Piet places a hat on the child’s head and says to the child in Dutch: “[l]ook here, and I found your hat. Do you want to put it on? You’ll be looking like an actual Pete! Let me see. Look....”
The post was viewed fewer than 1,000 times. While the majority of users who viewed the post were from the Netherlands, including the island of Curaçao, there were also views by users from Belgium, Germany and Turkey. The post received fewer than 10 comments and had fewer than 50 reactions, the majority of which were “likes” followed by “loves.” The content was not shared by other users. The post was reported by a Facebook user in the Netherlands for violating Facebook’s Hate Speech Community Standard.
On December 6, 2020, Facebook removed the post for violating its Hate Speech Community Standard. Facebook determined that the portrayals of Zwarte Piet in the video violated its policy prohibiting caricatures of Black people in the form of blackface. Facebook notified the user that their post “goes against our Community Standards on Hate Speech.” After Facebook rejected the user’s appeal against their decision to remove the content, the user submitted their appeal to the Oversight Board on December 7, 2020.
3. Authority and scope
The Board has authority to review Facebook's decision under Article 2 (Authority to review) of the Board's Charter and may uphold or reverse that decision under Article 3, Section 5 (Procedures for review: Resolution) of the Charter. Facebook has not presented reasons for the content to be excluded in accordance with Article 2, Section 1.2.1 (Content not available for Board review) of the Board's Bylaws, nor has Facebook indicated that it considers the case to be ineligible under Article 2, Section 1.2.2 (Legal obligations) of the Bylaws. Under Article 3, Section 4 (Procedures for review: Decisions) of the Board's Charter, the final decision may include a policy advisory statement, which will be taken into consideration by Facebook to guide its future policy development.
4. Relevant standards
The Oversight Board considered the following standards in its decision:
I. Facebook’s Community Standards
Facebook's Community Standards define hate speech as “a direct attack on people based on what we call protected characteristics – race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability.” Direct attacks include “dehumanizing speech” and “harmful stereotypes.” Under “Tier 1,” prohibited content (“do not post”) includes content targeting a person or group of people on the basis of a protected characteristic with “designated dehumanizing comparisons, generalizations, or behavioral statements (in written or visual form).” “Caricatures of Black people in the form of blackface” is specifically listed as an example of violating content.
In Facebook’s Hate Speech Community Standard, the company states that hate speech is not allowed on the platform "because it creates an environment of intimidation and exclusion and, in some cases, may promote real-world violence."
II. Facebook’s values
Facebook’s values are outlined in the introduction to the Community Standards. The value of “Voice” is described as “paramount”:
The goal of our Community Standards has always been to create a place for expression and give people a voice. […] We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.
Facebook limits “Voice” in service of four values, and two are relevant here:
“Safety”: We are committed to making Facebook a safe place. Expression that threatens people has the potential to intimidate, exclude or silence others and isn't allowed on Facebook.
“Dignity”: We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade others.
III. Human rights standards
The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. The Board's analysis in this case was informed by the following human rights standards:
- Freedom of expression: Article 19, International Covenant on Civil and Political Rights (ICCPR); General Comment No. 34, Human Rights Committee (2011); UN Special Rapporteur on freedom of opinion and expression reports: A/HRC/38/35 (2018) and A/74/486 (2019).
- The right to take part in cultural life: Article 15, International Covenant on Economic, Social and Cultural Rights (ICESCR); General Comment No. 21, Committee on Economic, Social and Cultural Rights (2009).
- The right to non-discrimination: Article 2, para. 1, ICCPR; Article 2, Convention on the Elimination of All Forms of Racial Discrimination (CERD); General Recommendation No. 34, Committee on the Elimination of Racial Discrimination (CERD Committee) (2011); Concluding Observations on the Netherlands (CERD/C/NLD/CO/19-21), CERD Committee (2015); UN Special Rapporteur on racism, report A/HRC/44/57/Add.2 (2020); Report of the Working Group of Experts on People of African Descent (WGEPAD), A/HRC/30/56/Add.1 (2015).
- The right to health: Article 12, ICESCR.
- The rights of the child: Articles 2 and 6, Convention on the Rights of the Child (CRC).
5. User statement
The user stated in their appeal to the Board that the post was meant for their child, who was happy with it, and that they want the content back up on Facebook. The user also stated that “the color does not matter” in this case because, in their view, Zwarte Piet is important to children.
6. Explanation of Facebook’s decision
Facebook removed this post as a Tier 1 attack under the Hate Speech Community Standard, specifically for violating its rule prohibiting harmful stereotypes and dehumanizing generalizations in visual form, which includes caricatures of Black people in the form of blackface. Facebook announced the “blackface” policy via its Newsroom and through the news media in August 2020. At the same time, the company updated its Hate Speech Community Standard to include the blackface policy. In November 2020, Facebook released a video in Dutch explaining the potential effects of this policy on the portrayal of Zwarte Piet on the platform.
Facebook also noted that the policy is the outcome of extensive research and external stakeholder engagement. As a result, Facebook concluded that the portrayals of Zwarte Piet “insult, discriminate, exclude, and dehumanize Black people by representing them as inferior and even subhuman” because the figure’s characteristics are “exaggerated and unreal.” Moreover, Facebook stated that Zwarte Piet is “a servile character whose typical behavior includes clumsiness, buffoonery, and speaking poorly.”
Facebook submitted that because “[t]he two people in the video were dressed in the typical Black Pete costume -- their faces were painted in blackface and they wore Afro-wigs,” its decision to remove the content was consistent with its blackface policy. Facebook also noted there was no indication the content was shared to condemn or raise awareness about the use of blackface, which is a general exception built into the Hate Speech Community Standard.
Facebook also submitted that its removal of the content was consistent with its values of “Dignity” and “Safety,” when balanced against the value of “Voice.” According to Facebook, the harms caused by portrayals of Zwarte Piet on its platform “even if intended to do no harm by the user, cause such extreme harm and negative experience that they must be removed.”
Facebook further stated that its decision to remove the content was consistent with international human rights standards. Facebook stated that (a) its policy was clearly and easily accessible, (b) the decision to remove the content was legitimate to protect the rights of others from harm and discrimination, and (c) its decision was “necessary to prevent harm to the dignity and self-esteem of children and adults of African descent.” In order to meet the requirement of proportionality for restrictions on expression, Facebook argued its policy applied to a narrow set of “the most egregious stereotypes.”
7. Third-party submissions
The Oversight Board received 22 public comments related to this case. Seven of the comments were submitted from Europe and 15 from the United States and Canada. The submissions covered themes including the history of Zwarte Piet; whether the character's portrayal is harmful to Black people, especially Black children; and how Facebook's Hate Speech Community Standard applies to this case and complies with international human rights standards.
8. Oversight Board analysis
This case presents several tensions for the Board to grapple with, because it involves a longstanding cultural tradition shared and enjoyed by many Dutch people without apparent racist intent. The tradition, however, includes people in blackface, which is widely recognized around the globe, and increasingly in the Netherlands, as a harmful racial stereotype. In this case, a user objects to Facebook’s removal of a family video, shared with a relatively small audience, celebrating a festive tradition with a child. It features the character Zwarte Piet in blackface, a form of expression Facebook recently chose to prohibit based on its values of “Voice,” “Safety” and “Dignity.” There is no suggestion that the user intended to cause harm, and they do not consider the post to be hate speech. At the same time, many people, including academics, social and cultural experts, public authorities, and a growing number of national actors in the Netherlands, believe that the practice is discriminatory and can cause harm (evidence supporting this view is set out in section 8.3 below). Numerous human rights beyond expression are implicated in this case, including cultural rights, equality and non-discrimination, mental health, and the rights of children.
The Board seeks to evaluate whether this content should be restored to Facebook through three lenses: Facebook’s Community Standards; the company’s values; and its human rights responsibilities. The complexity of these issues allows reasonable people to reach different conclusions, and the Board was divided on this case.
8.1 Compliance with Community Standards
Facebook enforces its Community Standard on Hate Speech by identifying (i) a “direct attack” and (ii) a “protected characteristic” the direct attack was based upon. In this case, the Board agrees with Facebook that both elements required for enforcing the Community Standard were satisfied.
The policy rationale for the Hate Speech Community Standard lists “dehumanizing speech” and “harmful stereotypes” as examples of an attack. Under the “do not post” section, “designated dehumanizing comparisons, generalizations, or behavioral statements (in written or visual form)” are prohibited, expressly including “caricatures of Black people in the form of blackface.” The Hate Speech Community Standard includes race and ethnicity among the list of protected characteristics. In this case, Facebook notified the user that their content violated the Hate Speech Community Standard. However, the user was not informed that the post was specifically removed under the blackface policy.
The Board notes the user claimed their intent was to share a celebration of a festive tradition. The Board has no reason to believe this view was not sincerely held. However, the Hate Speech Community Standard, including the rule on blackface, does not require a user to intend to attack people based on a protected category. Facebook’s rule is structured to presume that any use of blackface is inherently a discriminatory attack. On this basis, Facebook’s action to remove this content was consistent with its content policies.
The Board notes that the Hate Speech Community Standard provides a general exception to allow people to “share content that includes someone else’s hate speech to condemn it or raise awareness.” They further state: “our policies are designed to allow room for these types of speech, but we require people to clearly indicate their intent. If intention is unclear, we may remove content.” The Board agreed that this exception did not apply in this case.
A majority of the Board noted that the two adults in the video had their whole faces painted black, wore Afro wigs, colorful renaissance-style clothes and acted as servants of Sinterklaas. The majority also found that the content included potentially harmful stereotypes, such as servitude and inferiority. In light of this, as well as of the analysis of Facebook’s values and human rights responsibilities below, the majority affirms that removing the content was in line with Facebook’s Hate Speech Community Standard.
For a minority, however, Facebook’s general presumption that blackface intimidates, excludes or promotes violence raised concerns, which are addressed below.
8.2 Compliance with Facebook’s values
For a majority of the Board, the decision to remove this content, and the prohibition on blackface, complied with Facebook’s values of “Voice,” “Safety” and “Dignity.”
The use of blackface, including portrayals of Zwarte Piet, is widely agreed to be degrading towards Black people. In this regard, the Board references reports by international human rights mechanisms as well as regional and national authorities, discussed in more detail in section 8.3(III.) below. The user’s content included caricatures that are inextricably linked to negative and racist stereotypes originating in the enslavement of Black people, caricatures that parts of Dutch society consider to sustain systemic racism in the Netherlands today. In relation to the value of “Voice,” the user’s video is not political speech or a matter of public concern and is, on its own, purely private. For the majority, it cannot be decisive that the user shared this content without malicious intent or hatred towards Black people. Allowing such posts to accumulate on Facebook would create a discriminatory environment for Black people that would be degrading and harassing. At scale, the policy is clear and protects Black people’s dignity, safety and voice on the platform. Restricting the voice of people who share depictions of blackface in contexts that do not condemn racism is acceptable to achieve this objective.
A minority of the Board found that Facebook should have given greater weight to the user’s voice in this case, even if it is of a private nature. They recall that Facebook’s value of “Voice” specifically protects disagreeable and objectionable content. While blackface may offend, the minority believed that depictions on Facebook will not always cause harm to others, and exceptions to the Hate Speech Community Standard are too narrow to allow for these situations. In this case, the minority believed that during an apparently private occasion, the child was encouraged to identify themselves with Zwarte Piet and that the interaction could be regarded as positive. The minority therefore believes Facebook has presented insufficient evidence of harm to justify the suppression of “Voice.” In their view, the removal of this post, without notice of the specific rule violated, caused confusion for the user who posted it and did not advance the values of “Dignity” or “Safety.”
8.3 Compliance with Facebook’s human rights responsibilities
A majority found the removal of the user’s content under the Community Standard on Hate Speech was consistent with Facebook’s human rights responsibilities, in particular to address negative human rights impacts that can arise from its operations (UNGPs, Principles 11 and 13).
Human rights due diligence (UNGPs)
Facebook’s rule on blackface was the outcome of a wider process set up to build a policy on harmful stereotypes. This process involved extensive research and engagement with more than 60 stakeholders, including experts in a variety of fields, civil society groups, and groups affected by discrimination and harmful stereotypes.
For the majority, this was in line with international standards for ongoing human rights due diligence to evolve the company’s operations and policies (Principles 17(c) and 18(b), UNGPs; UN Special Rapporteur on freedom of expression, report A/74/486, paras. 44 and 58(e)). For the minority, Facebook provided insufficient information on the extent of research and stakeholder engagement in countries where the Sinterklaas tradition is present, such as the Netherlands.
Freedom of expression (Article 19 ICCPR)
Article 19, para. 2 of the ICCPR provides broad protection for expression of “all kinds.” The UN Human Rights Committee has made clear the protection of Article 19 extends to expression that may be considered “deeply offensive” (General Comment No. 34, paras. 11, 12).
The Board noted that the right to participate in cultural life, protected under Article 15 of the ICESCR, is also relevant. Participating in the Sinterklaas festival and posting related content on Facebook – including images of Zwarte Piet in blackface – could be understood as taking part in the cultural life of the Netherlands.
Both the right to freedom of expression and the right to participate in cultural life should be enjoyed by all without discrimination on grounds of race or ethnicity (Article 2, para. 1, ICCPR; Article 2, para. 2, ICESCR).
While the right to freedom of expression is fundamental, it is not absolute. It may be restricted, but restrictions should meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR). Facebook should seek to align its content moderation policies on hate speech with these principles (UN Special Rapporteur on freedom of expression, report A/74/486, at para. 58(b)). Likewise, the right to participate in cultural life may be subject to similar restrictions in order to protect other human rights (General Comment No. 21, para. 19).
I. Legality
The Board found that Facebook’s Hate Speech Community Standard was sufficiently clear and precise to put users on notice that content featuring blackface would be removed unless a relevant exception was engaged (General Comment No. 34, para. 25). Facebook further sought to raise awareness of the potential effects of this policy change in the Netherlands by releasing a video in Dutch ahead of the Sinterklaas festival in November 2020. This explained the reasons why portrayals of Zwarte Piet are not permitted on the platform.
II. Legitimate aim
The Board agreed the restriction pursued the legitimate aim of protecting the rights of others (General Comment No. 34, para. 28). These include the rights to equality and non-discrimination, including on the basis of race and ethnicity (Article 2, para. 1, ICCPR; Article 2, CERD). Facebook sought the legitimate aim of preventing discrimination in equal access to a platform for expression (Article 19, ICCPR) and of protecting against discrimination in other fields, which in turn is important to protect the right to health of persons targeted by discrimination (Article 12, ICESCR), especially children, who under the CRC receive additional protection against discrimination and guarantees for their right to development (Articles 2 and 6, CRC).
The Board further agreed that it is not a legitimate aim to restrict expression for the sole purpose of protecting individuals from offense (UN Special Rapporteur on freedom of expression, report A/74/486, para. 24), as international human rights law places a high value on uninhibited expression (General Comment No. 34, para. 38).
III. Necessity and proportionality
Any restrictions on freedom of expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected” (General Comment No. 34, para. 34).
A majority of the Board considered Facebook’s Hate Speech Community Standard, and whether the application of the rule on blackface in this case was necessary to protect the rights of Black people to equality and non-discrimination, in particular for children. This was consistent with Facebook’s responsibility to adopt policies to avoid causing or contributing to adverse human rights impacts (UNGPs, Principle 13).
As the Board also identified in case decision 2020-003-FB-UA, moderating content to address the cumulative harms of hate speech, even where the expression does not directly incite violence or discrimination, can be consistent with Facebook’s human rights responsibilities in certain circumstances. For the majority, the accumulation of degrading caricatures of Black people on Facebook creates an environment in which acts of violence are more likely to be tolerated and discrimination is reproduced in society. As with degrading slurs, context will always be important, even for the enforcement of a general rule. Here, the experience of discrimination against Black people in the Netherlands, and the connection of Zwarte Piet and blackface to that experience, was crucial.
As the UN Special Rapporteur on freedom of expression has observed, “the scale and complexity of [social media companies] addressing hateful expression presents long-term challenges and may lead companies to restrict such expression even if it is not clearly linked to adverse outcomes (as hateful advocacy is connected to incitement in article 20 of the [ICCPR])” (report A/HRC/38/35, para. 28). The Special Rapporteur has also indicated companies may remove hate speech that falls below the threshold of incitement to discrimination or violence; when departing from the high standard states must meet to justify restrictions on expression in criminal or civil law, companies must provide a reasoned explanation of the policy difference in advance, clarified in accordance with human rights standards (A/74/486, paras. 47-48).
The Board notes international human rights law would not allow a state to impose a general prohibition on blackface through criminal or civil sanctions, except under the conditions foreseen in ICCPR Article 20, para. 2 and Article 19, para. 3 (e.g., advocacy of hatred constituting incitement to violence) (A/74/486, para. 48). Expression that does not reach this threshold may still raise concerns about tolerance, civility and respect for others, but restricting it would not be necessary or proportionate for a state (Rabat Plan of Action, paras. 12 and 20). In the Board’s view, the individual post in this case would fall within this category of protection from state restriction.
The majority found Facebook followed international guidance and met its human rights responsibilities in this case. Numerous human rights mechanisms have found the portrayal of Zwarte Piet to be a harmful stereotype, connecting it to structural racism in the Netherlands, with severe harms at a societal and individual level. For the majority, this justified Facebook adopting a policy that departs from the human rights standards binding states, where the intent of the person sharing content featuring blackface is only material if they are condemning its use or raising awareness.
The CERD Committee observed in its ‘Concluding Observations on the Netherlands’ that Zwarte Piet “is experienced by many people of African descent as a vestige of slavery” and is connected to structural racism in the country (CERD/C/NLD/CO/19-21, paras. 15 and 17). The majority noted that the UN Working Group of Experts on People of African Descent has also reached similar conclusions (A/HRC/30/56/Add.1, para. 106). The Board agrees with the CERD Committee that “even a deeply rooted cultural tradition does not justify discriminatory practices and stereotypes” (CERD/C/NLD/CO/19-21, para. 18; see also UN ESCR Committee, General Comment No. 21, paras. 18 and 51).
The majority was also persuaded by the documented experiences of Black people in the Netherlands of racial discrimination and violence, often linked to and exacerbated by the cultural practice of Zwarte Piet. It found persuasive the Dutch Ombudsman for Children’s finding that “portrayals of Zwarte Piet can contribute to bullying, exclusion and discrimination against Black children,” along with reports that during the Sinterklaas festival Black children felt scared and unsafe in their homes and were afraid to go to school. Additionally, the Board noted reported episodes of intimidation and violence against people peacefully protesting Zwarte Piet (CERD/C/NLD/CO/19-21, para. 17). The Board also noted the work of the European Commission against Racism and Intolerance (ECRI report on the Netherlands, paras. 30-31), the Netherlands Institute for Human Rights, and the European Commission’s network of legal experts in gender equality and non-discrimination (European Commission Country Report Netherlands 2020, page 24, footnote 89).
The majority of the Board further noted that repeated negative stereotypes about an already marginalized minority, including in the form of images shared on social media, have a psychological impact on individuals, with societal consequences. Repeated exposure to this particular stereotype may nurture ideas of racial supremacy in people who are not Black, which may lead individuals to justify and even incite discrimination and violence. For Black people, the cumulative effect of repeated exposure to such images, as well as being on the receiving end of violence and discrimination, may impact self-esteem and health, in particular for children (Article 12, ICESCR; Articles 2 and 6, CRC). The Board notes the work of Izalina Tavares, “Black Pete: Analyzing a Racialized Dutch Tradition Through the History of Western Creations of Stereotypes of Black Peoples,” in this regard.
Other academic studies have also drawn a causal connection between portrayals of Zwarte Piet and harm, several of which Facebook also included in its decision rationale to the Board. These include Judi Mesman, Sofie Janssen and Lenny van Rosmalen, “Black Pete through the Eyes of Dutch Children,” and Yvon van der Pijl and Karina Gourlordava, “Black Pete, ‘Smug Ignorance,’ and the Value of the Black Body in Postcolonial Netherlands.” This fits within a broader literature on the topic, including John F. Dovidio, Miles Hewstone, Peter Glick, and Victoria M. Esses, “Prejudice, Stereotyping and Discrimination: Theoretical and Empirical Overview.”
According to the majority, there is sufficient evidence of objective harm to individuals’ rights to distinguish this rule from one that seeks to insulate people from subjective offense.
The majority also found the removal to be proportionate. Less severe interventions, such as labels, warning screens, or other measures to reduce dissemination, would not have provided adequate protection against the cumulative effects of leaving content of this nature on the platform. The challenge of assessing intent when enforcing against content at scale should also be considered: a case-by-case examination would give rise to significant uncertainty, weighing in favor of a general rule that can more easily be enforced (see, for a comparative perspective: European Court of Human Rights, Case of Animal Defenders International v. the United Kingdom, para. 108).
The majority further noted that the prohibition Facebook imposed is not blanket in nature, and that the availability of human review will be essential for accurate enforcement. There is an exception under the Hate Speech Community Standard that also applies to the blackface policy, allowing depictions of blackface to condemn or raise awareness about hate speech. The newsworthiness allowance further allows Facebook to permit violating content on the platform where the public interest in the expression outweighs the risk of harm (for example, if pictures or footage of a public figure in blackface were to become a topic of national news coverage).
Modified “Piet” traditions that have abandoned the use of blackface are also not affected by the Hate Speech Community Standard, and this was significant for the majority. The user can therefore adapt their tradition if they wish to share footage of it through their Facebook account. A growing number of national actors in the Netherlands have distanced themselves from the practice and/or promoted alternative and inclusive forms of the tradition (European Race and Imagery Foundation report, pages 7, 24 and 56-58). Against the backdrop of a global reckoning with racism and white supremacy, it is consistent with Facebook’s human rights responsibilities to adopt operational rules and procedures that promote equality and non-discrimination.
While appreciating the arguments of the majority, the minority did not believe the requirements of necessity and proportionality had been met. They noted that the rule is unduly broad, and that a more nuanced policy would allow Facebook to address well-placed concerns relating to discrimination while avoiding collateral damage to expression that does not intend or directly cause harm. The minority believed that, while certainly relevant, the evidence presented was insufficient to demonstrate in precise terms a causative link between the expression under review and the harm being prevented or reduced by limiting it (General Comment 34, para. 34): the policy should allow for the possibility that such expression will not always intend or contribute to harm. The minority noted that the excessive enforcement of the current policy is likely to have a chilling effect on freedom of expression. They also found that predicating content removal on the notion of cumulative harm makes restrictions of this sort difficult to distinguish from rules that seek to protect people from subjective feelings of offense. Likewise, the minority believed that a general negative psychological impact on individuals with societal consequences was not sufficiently demonstrated and would not justify interference with speech under international human rights law unless it reaches the threshold of incitement (Article 20, para. 2, ICCPR). They also expressed concern that Facebook’s power may be exercised in a way that interferes with a matter under national discussion and may distort or even supplant processes in a democratic society that would counter discrimination.
For the minority, removing potentially discriminatory content at scale where the user does not intend harm and where harm is unlikely to result will not effectively address racial discrimination. They agreed with the majority that removing content without providing the user with an adequate explanation could be perceived as unfair. The confusion that may result from being accused of “attack” and “hate speech” where no harm was intended could undermine efforts on and off Facebook to bring awareness and clarity about Facebook’s content policies to people. For the majority, this would be addressed, and the platform made more inclusive, if content removal notices provided more information to the user on the justification for the rule enforced, including access to resources explaining the potential harms Facebook is seeking to mitigate.
9. Oversight Board decision
The Oversight Board upholds Facebook’s decision to remove the content.
10. Policy advisory statement
The following recommendations are numbered, and the Board requests that Facebook provides an individual response to each as drafted.
Explaining the blackface policy on Facebook to users
- Facebook should link the rule in the Hate Speech Community Standard prohibiting blackface to the company’s reasoning for the rule, including harms it seeks to prevent.
- In line with the Board’s recommendation in case 2020-003-FB-UA, Facebook should “ensure that users are always notified of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing.” In this case, any notice to users should specify the rule on blackface and also link to the above-mentioned resources that explain the harm this rule seeks to prevent. Facebook should provide a detailed update on its “feasibility assessment” of the Board’s prior recommendations on this topic, including the specific nature of any technical limitations and how these can be overcome.
Procedural note:
The Oversight Board's decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.
For this case decision, independent research was commissioned on behalf of the Board. An independent research institute headquartered at the University of Gothenburg and drawing on a team of over 50 social scientists on six continents, as well as more than 3,200 country experts from around the world, provided expertise on socio-political and cultural context.