Overturned

Nazi quote

The Oversight Board has overturned Facebook’s decision to remove a post which the company claimed violated its Community Standard on Dangerous Individuals and Organizations.

Type of Decision
Standard

Policies and Topics
Topic: Politics
Community Standard: Dangerous Individuals and Organizations

Region/Countries
Location: United States

Platform
Facebook
Case Summary

The Oversight Board has overturned Facebook’s decision to remove a post which the company claimed violated its Community Standard on Dangerous Individuals and Organizations. The Board found that these rules were not made sufficiently clear to users.

About the case

In October 2020, a user posted a quote which was incorrectly attributed to Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany. The quote, in English, claimed that, rather than appealing to intellectuals, arguments should appeal to emotions and instincts. It stated that truth does not matter and is subordinate to tactics and psychology. There were no pictures of Joseph Goebbels or Nazi symbols in the post. In their statement to the Board, the user said that their intent was to draw a comparison between the sentiment in the quote and the presidency of Donald Trump.

The user first posted the content two years earlier and was prompted to share it again by Facebook’s “memory” function, which allows users to see what they posted on a specific day in a previous year, with the option of resharing the post.

Facebook removed the post for violating its Community Standard on Dangerous Individuals and Organizations.

Key findings

In its response to the Board, Facebook confirmed that Joseph Goebbels is on the company’s list of dangerous individuals. Facebook claimed that posts which share a quote attributed to a dangerous individual are treated as expressing support for that individual, unless the user provides additional context to make their intent explicit.

Facebook removed the post because the user did not make clear that they shared the quote to condemn Joseph Goebbels, to counter extremism or hate speech, or for academic or news purposes.

Reviewing the case, the Board found that the quote did not support the Nazi party’s ideology or the regime’s acts of hate and violence. Comments on the post from the user’s friends supported the user’s claim that they sought to compare the presidency of Donald Trump to the Nazi regime.

Under international human rights standards, any rules which restrict freedom of expression must be clear, precise and publicly accessible, so that individuals can conduct themselves accordingly. The Board does not believe that Facebook’s rules on Dangerous Individuals and Organizations met this requirement.

The Board noted a gap between the rules made public through Facebook’s Community Standards and additional, non-public rules used by the company’s content moderators. In its publicly available rules, Facebook is not sufficiently clear that, when posting a quote attributed to a dangerous individual, the user must make clear that they are not praising or supporting them.

Facebook’s policy on Dangerous Individuals and Organizations also does not provide clear examples that explain the meaning of terms such as “praise” and “support,” making it difficult for users to understand this Community Standard.

While Facebook confirmed to the Board that Joseph Goebbels is designated as a dangerous individual, the company does not provide a public list of dangerous individuals and organizations, or examples of these. The Board also notes that, in this case, the user does not seem to have been told which Community Standard their content violated.

The Oversight Board’s decision

The Oversight Board overturns Facebook’s decision to remove the content and requires that the post be restored.

In a policy advisory statement, the Board recommends that Facebook:

  • Ensure that users are always notified of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing.
  • Explain and provide examples of the application of key terms from the Dangerous Individuals and Organizations policy, including the meanings of “praise,” “support” and “representation.” The Community Standard should also better advise users on how to make their intent clear when discussing dangerous individuals or organizations.
  • Provide a public list of the organizations and individuals designated as “dangerous” under the Dangerous Individuals and Organizations Community Standard or, at the very least, a list of examples.

*Case summaries provide an overview of the case and do not have precedential value.

Full Case Decision

1. Decision Summary

The Oversight Board has overturned Facebook’s decision to remove a post which the company claimed violated its Community Standard on Dangerous Individuals and Organizations. The Board found that these rules were not made sufficiently clear to users.

2. Case Description

In October 2020, a user posted a quote which was incorrectly attributed to Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany. The quote, in English, claimed that there is no point in appealing to intellectuals, as they will not be converted and, in any case, yield to the stronger man in the street. As such, the quote stated, arguments should appeal to emotions and instincts. It ended by claiming that truth does not matter and is subordinate to tactics and psychology. There were no pictures of Goebbels or Nazi symbols in the post. The user first posted the content two years earlier and was prompted to share it again by Facebook’s “memory” function, which allows users to see what they posted on a specific day in a previous year, with the option of resharing the post. There were no user reports of the content. Facebook removed the post for violating the Community Standard on Dangerous Individuals and Organizations.

The post comprised the quote and attribution to Goebbels alone. There was no additional commentary within the post indicating the user’s intent in sharing the content. In their statement to the Board, the user explained that the quote involved important social issues and that its content was “VERY IMPORTANT right now in our country as we have a ‘leader’ whose presidency is following a fascist model.” Their intent was to draw a comparison between the sentiment in the quote and the presidency of Donald Trump. The comments on the post suggest that the user’s friends understood this to be the case.

3. Authority and Scope

The Board has the authority to review Facebook’s decision under Article 2 (Authority to Review) of the Board’s Charter and may uphold or reverse that decision under Article 3, Section 5 (Procedures for Review: Resolution) of the Charter. Facebook has not presented reasons for the content to be excluded in accordance with Article 2, Section 1.2.1 (Content Not Available for Board Review) of the Board’s Bylaws, nor has Facebook indicated that it considers the case to be ineligible under Article 2, Section 1.2.2 (Legal Obligations) of the Bylaws.

4. Relevant Standards

The Oversight Board considered the following standards in its decision:

I. Facebook’s Community Standards:

The Community Standard on Dangerous Individuals and Organizations states that “in an effort to prevent and disrupt real-world harm, we do not allow any organizations or individuals that proclaim a violent mission or are engaged in violence to have a presence on Facebook.” It further states that Facebook will “remove content that expresses support or praise for groups, leaders or individuals involved in these activities.”

II. Facebook’s Values:

The Facebook values relevant to this case are outlined in the introduction to the Community Standards. The first is “Voice”, which is described as “paramount”:

The goal of our Community Standards has always been to create a place for expression and give people a voice. […] We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.

Facebook limits “Voice” in service of four other values. The Board considers that the value of “Safety” is relevant to this decision:

Safety: We are committed to making Facebook a safe place. Expression that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on Facebook.

III. Relevant Human Rights Standards considered by the Board:

The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. Drawing upon the UNGPs, the following international human rights standards were considered in this case:

  • The right to freedom of expression: Article 19 of the International Covenant on Civil and Political Rights (ICCPR), as interpreted in the Human Rights Committee’s General Comment No. 34 (2011)
  • The rights to equality and non-discrimination: General Comment No. 34, paras. 26 and 32

5. User Statement

The user says that they first posted this content two years ago and were prompted to post it again by Facebook’s “memory” function. They explain that the post is important as the United States has a leader whose presidency is following a fascist model. They further state that their ability to use Facebook was restricted after they posted the content.

6. Explanation of Facebook’s Decision

Facebook states that it treats content that quotes, or attributes quotes (regardless of their accuracy), to a designated dangerous individual as an expression of support for that individual unless the user provides additional context to make their intent explicit. It says that in this case, the user provided no additional context indicating that the quote was shared to condemn Goebbels, to counter extremism or hate speech, or as part of an academic or newsworthy discourse. Facebook says that it would not have removed this content if the user’s post had made clear that it was shared for one of these reasons. Although comments made by others on the user’s post indicated that the user did not intend to praise or support Joseph Goebbels, Facebook explained that it reviews only the post itself when making a moderation decision. The content was not removed when originally posted because there were no user reports against it and it was not automatically detected.

The Board also notes that, when Facebook informed the user that their post had been removed, the company did not tell them which Community Standard their post had violated.

7. Third party submissions

The Oversight Board considered 12 public comments related to this case. Three of the comments were submitted from Europe and nine from the United States and Canada region.

The submissions covered the following themes: compliance with the relevant Community Standards; whether this constitutes political speech; the role of Facebook’s “memory” function; the effect of sanctions on users; and feedback on improving the public comment process.

8. Oversight Board Analysis

8.1 Compliance with Community Standards

The Board finds that Facebook’s decision to remove the user’s post does not comply with the Community Standard on Dangerous Individuals and Organizations.

Facebook says that to prevent and disrupt real-world harm, it prohibits organizations and individuals (living or deceased) involved in organized hate from having a presence on Facebook. It also prohibits content that expresses support or praise for such groups, their leaders, or individuals involved in these activities. Facebook does not publish a list of individuals and organizations whom it has designated as dangerous.

In the decision rationale it provided to the Board, Facebook clarified certain aspects of the Dangerous Individuals and Organizations policy that are not outlined in the Community Standards. First, Facebook confirmed that it has internally designated the Nazi party (the National Socialist German Workers’ Party, active between 1920 and 1945) as a hate organization since 2009. Joseph Goebbels, as one of the party’s leaders, is designated as a dangerous individual. Second, Facebook treats all content that quotes, or purports to quote, a designated dangerous individual as an expression of praise or support for that individual unless the user provides additional context to make their intent explicit. Third, Facebook determines compliance with the policy solely on the basis of the text and/or imagery within the post itself, without assessing reactions or comments to the post.

In this case, the content involved a single quote attributed to Joseph Goebbels. The Board finds that the quote did not promote the ideology of the Nazi party and did not endorse the regime’s acts of hate and violence. Comments on the post from the user’s friends appear to support the user’s claim that the post sought to draw comparisons between the presidency of Donald Trump and the Nazi regime.

The Board notes an information gap between the publicly available text of the Dangerous Individuals and Organizations policy and the additional internal rules applied by Facebook’s content moderators. The public text is not sufficiently clear that, when posting a quote attributed to a dangerous individual, the user must provide additional context in their post to make explicit that they are not praising or supporting an individual or organization involved in organized hate. The Community Standards state a similar requirement for posts including symbols of designated organizations and individuals, but do not state the same for content praising or supporting them. As illustrated by this case, this results in the suppression of speech that poses no risk of harm. While the Board appreciates the importance of combatting the spread of Nazi ideology and hate speech, as well as the difficulty of pursuing such aims at scale, in this case the removal of the post clearly falls outside the spirit of the policy.

8.2 Compliance with Facebook Values

The Board finds that the removal does not comply with Facebook’s values. When considering content removed under the Dangerous Individuals and Organizations policy, the value of “Safety” is balanced against the “paramount” value of “Voice.” Facebook explains that “Safety” may be given more weight when content may lead to physical harm. In this case, however, considering the minimal benefit to the value of “Safety” from the user’s post, the Board finds that the removal unnecessarily undermined the value of “Voice.”

8.3 Compliance with Human Rights Standards

  1. Freedom of expression (Article 19 ICCPR)

Applying international human rights standards on the right to freedom of expression, the Board finds that the content must be restored. The value placed on the right to freedom of expression is particularly high in public debate about political figures, which was the subject of this post (ICCPR Article 19, para. 2, General Comment 34, para. 34).

The right to freedom of expression is not absolute. Any restriction of the right must, however, meet the requirements of legality, legitimate aim, and necessity and proportionality. Facebook’s removal of the content failed both the first and third parts of this test.

a. Legality

Any rules restricting expression must be clear, precise and publicly accessible (General Comment 34, para. 25) to allow individuals to change their conduct accordingly. Facebook’s policy on Dangerous Individuals and Organizations falls short of the standard of legality. The policy lacks clear examples that explain the application of “support,” “praise” and “representation,” making it difficult for users to understand this Community Standard. This adds to concerns around legality and may create a perception of arbitrary enforcement among users. The Board is also concerned that in this case the user does not appear to have been informed which Community Standard they violated when their content was removed.

Facebook also fails to provide a list of individuals and organizations designated as dangerous, or, at the least, examples of groups or individuals that are designated as dangerous. Lastly, the policy fails to explain how it ascertains a user’s intent, making it hard for users to foresee how and when the policy will apply and conduct themselves accordingly.

b. Legitimate aim

Article 19, para. 3 of the ICCPR states that legitimate aims include respect for the rights or reputations of others, as well as the protection of national security, public order, or public health or morals. Facebook’s rationale indicates that the aim of the Dangerous Individuals and Organizations policy in relation to what it terms “hate organizations” is to protect the rights of others. The Board is satisfied that the specific provisions on “hate organizations” aim to protect individuals from discrimination, as well as from attacks on life and foreseeable intentional acts resulting in physical or mental injury.

c. Necessity and Proportionality

Restrictions “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected” (General Comment 34, para. 34).

Context is key for assessing necessity and proportionality. The Board notes the reported global rise in support and acceptance of neo-Nazi ideology (Report A/HRC/38/53, 2018), as well as the challenge Facebook faces in restricting the presence of “hate organizations” on the platform. The Board considers that it may be necessary, when moderating content about dangerous organizations at scale, to remove posts where there is insufficient context. In this case, however, the content of the quote and other users’ responses to it, the user’s location and the timing of the post during an election campaign are all relevant. Facebook’s approach, which requires content moderators to review content without regard to these contextual cues, resulted in an unnecessary and disproportionate restriction on expression.

d. Equality and non-discrimination

Any restrictions on expression must respect the principle of equality and non-discrimination (General Comment 34, paras. 26 and 32). The Board recognizes the importance of Facebook combatting Nazi ideology on the platform, particularly in the context of documented increases in support for such ideas and anti-Semitism around the world. However, removing content that sought to criticize a politician by comparing their style of governance to that of an architect of Nazi ideology does not promote equality and non-discrimination.

9. Oversight Board Decision

9.1 Content Decision

The Oversight Board overturns Facebook’s decision to take down the content, requiring the post to be restored.

9.2 Policy Advisory Statement

The Board recommends that Facebook:

  • Ensure that users are always notified of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing (e.g. for support of a hate organization).
  • Explain and provide examples of the application of key terms used in the Dangerous Individuals and Organizations policy, including the meanings of “praise,” “support” and “representation.” These should align with the definitions used in Facebook’s Internal Implementation Standards. The Community Standard should provide clearer guidance to users on how to make their intent apparent when discussing individuals or organizations designated as dangerous.
  • Provide a public list of the organizations and individuals designated “dangerous” under the Dangerous Individuals and Organizations Community Standard. At a minimum, illustrative examples should be provided. This would help users to better understand the policy and conduct themselves accordingly.

*Procedural note:

The Oversight Board’s decisions are prepared by panels of five Members and must be agreed by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.
