Multiple Case Decision

Greek 2023 Elections Campaign

The Oversight Board reviewed two Facebook posts together, both shared around the time of Greece’s June 2023 General Election. The Board has upheld Meta’s decisions to remove the content in both cases for violating the company’s Dangerous Organizations and Individuals policy.

2 cases included in this bundle

FB-368KE54E (Upheld)

Case about dangerous individuals and organizations on Facebook

Platform: Facebook
Topic: Elections
Standard: Dangerous individuals and organizations
Location: Australia, Greece
Date: Published on March 28, 2024

FB-3SNBY3Q2 (Upheld)

Case about dangerous individuals and organizations on Facebook

Platform: Facebook
Topic: Elections
Standard: Dangerous individuals and organizations
Location: Greece
Date: Published on March 28, 2024

Summary

In reviewing two cases about Facebook content posted around the time of the June 2023 General Election in Greece, the Board has upheld Meta’s removal of both posts. Both were removed for violating the company’s Dangerous Organizations and Individuals policy. The first case involved an electoral leaflet that included a statement in which a lawful candidate aligned himself with a designated hate figure, while in the second case an image of a designated hate entity’s logo was shared. The majority of the Board find these removals to be consistent with Meta’s human rights responsibilities. However, the Board recommends that Meta clarify the scope of the policy’s exception allowing content to be shared in the context of “social and political discourse” during elections.

About the Cases

These two cases involve content posted on Facebook by different users around the time of the June 2023 General Election in Greece.

In the first case, a candidate for the Spartans party in Greece posted an image of their electoral leaflet. On it, there is a statement that Mr. Ilias Kasidiaris – a Greek politician sentenced to 13 years in prison for directing the criminal activities and hate crimes of Golden Dawn – supports the Spartans.

Mr. Kasidiaris and other members of the far-right Golden Dawn party had been persecuting migrants, refugees and other minority groups in Greece before the party was declared a criminal organization in 2020. Ahead of his sentencing in 2020, Mr. Kasidiaris founded a new political party called National Party – Greeks. Later, in May 2023, the Greek Supreme Court disqualified National Party – Greeks from running in the 2023 elections since, under Greek law, parties with convicted leaders are banned from participating. Although Mr. Kasidiaris has been banned from Facebook since 2013 for hate speech, he uses other social media platforms from prison. This is how he declared his support for the Spartans roughly two weeks before the June election. The Spartans, which won 12 seats, acknowledged the part Mr. Kasidiaris played in driving the party’s success.

In the second case, another Facebook user posted an image of the logo of National Party – Greeks, which also includes the Greek word for “Spartans.”

Golden Dawn and National Party – Greeks are designated as Tier 1 hate organizations, and Mr. Kasidiaris as a Tier 1 hate figure, under Meta’s Dangerous Organizations and Individuals policy.

Both posts were reported to Meta. The company determined separately that both posts violated its Dangerous Organizations and Individuals Community Standard, removed the content and applied a severe strike and 30-day restriction to both accounts. The two different Facebook users who posted the content appealed to Meta, but the company again found it to be violating. Both users then appealed separately to the Board.

Key Findings

First Case

The majority of the Board find the post violated the Dangerous Organizations and Individuals policy (as written in June 2023) because the user broke the rule that prohibits “praise” of a designated entity. He did this by “ideologically aligning” himself with Mr. Kasidiaris, whom Meta designates as a hate figure. Because the rule included an explicit example of ideological alignment, it would have been sufficiently clear to users and content moderators. Even after the latest policy update, this post would still fall under the prohibition on “positive references” to Mr. Kasidiaris.

Furthermore, the majority of Board Members note that removing this post did not infringe on the public’s right to know about this endorsement. The public had plentiful other opportunities, including in local and regional media, to learn about this expression of support by Mr. Kasidiaris for the Spartans party.

A minority, however, find that the violation of the rule on ideological alignment was not immediately obvious because Mr. Kasidiaris was endorsing the lawful candidate, not vice versa. These Board Members also believe the exception for “newsworthiness” should have been applied to keep this content on Facebook, so that voters could have access to the fullest possible information on which to base their decisions.

Second Case

The majority of the Board find the image violated the Dangerous Organizations and Individuals policy because it shared a symbol of National Party – Greeks, a designated organization, and should have been removed. The user provided no context that would allow the exception for “reporting on, neutrally discussing or condemning” designated entities to apply.

However, a minority of Board Members believe that simply sharing logos associated with a designated entity, absent other violations or indications of harmful intent, should be allowed.

Overall Concerns

In the Board’s view, the policy exception for “social and political discourse” about designated entities during elections needs to be made clearer publicly. The Board also remains concerned about the lack of transparency around Meta’s designation of hate entities, which makes it challenging for users to understand which organizations or individuals they are allowed to align with ideologically or whose symbols they can share.

The Oversight Board’s Decision

The Oversight Board has upheld Meta’s decisions to remove both posts.

The Board recommends that Meta:

  • Clarify the scope of the Dangerous Organizations and Individuals Community Standard exception that allows for content “reporting on, neutrally discussing or condemning dangerous organizations and individuals or their activities” to be shared in the context of “social and political discourse.” Specifically, Meta should clarify how this exception applies to election-related content.

*Case summaries provide an overview of cases and do not have precedential value.

Full Case Decision

1. Decision Summary

The Oversight Board reviewed together two Facebook posts concerning the June 2023 General Election in Greece. The first case involves a Greek electoral candidate’s post, in which he shared details about his electoral campaign and an image of his electoral leaflet that featured an endorsement by a politician designated as a hate figure under Meta’s Dangerous Organizations and Individuals Community Standard. The second case concerns a post sharing the logo of a Greek party, National Party – Greeks, which is also a designated entity, with the word “Spartans” in Greek as part of the image. Meta removed both posts for violating its Dangerous Organizations and Individuals Community Standard.

The majority of the Board uphold Meta’s decisions to remove the content in both cases, finding that these removals conformed with Meta’s policies and human rights responsibilities. The Board recommends that Meta clarify the scope of its new “social and political discourse” exception to the Dangerous Organizations and Individuals Community Standard as it applies to elections.

2. Case Description and Background

These cases involve content posted on Facebook by different users in Greece around the time of the June 2023 General Election, the second set of elections held in the country that year after no party secured a majority in the May elections.

In the first case, a Facebook user, who was a candidate for the Spartans party in Greece, posted an image of his electoral leaflet, containing his photo and name, along with a caption in Greek describing his campaign’s progress ahead of the elections, including his preparations and engagement with the public. The leaflet included a statement that Mr. Ilias Kasidiaris supports the Spartans.

Mr. Kasidiaris, a Greek politician, was sentenced to 13 years in prison for directing the activities of Golden Dawn. Golden Dawn was declared a criminal organization in 2020 for its responsibility for hate crimes, including the murder of a Greek rap singer. In 2013, two Golden Dawn members were found guilty of murdering a Pakistani migrant worker. Mr. Kasidiaris and other Golden Dawn members had been actively engaged in persecuting migrants, refugees and other minority and vulnerable groups. During a 2012 Golden Dawn rally, Mr. Kasidiaris called the Roma community “human trash” and asked his supporters to “fight [...] if they wanted their area to become clean,” (see public comments, e.g., PC-20008 from ACTROM - Action for and from the Roma).

Before being sentenced in 2020, Mr. Kasidiaris founded a new political party called National Party – Greeks. On May 2, 2023, the Greek Supreme Court disqualified National Party – Greeks from running in the 2023 general elections in light of recently adopted amendments to the Greek constitution that ban parties with convicted leaders from participating in elections. Several international and regional media outlets reported that, ahead of the June 2023 elections, Mr. Kasidiaris had declared his support for the Spartans from prison using his social media accounts. Mr. Kasidiaris, who was banned from Facebook in 2013 for hate speech, now mainly uses other social platforms.

In the second case, a different Facebook user posted an image of the National Party – Greeks’ logo, which also includes the Greek word that translates as “Spartans.”

The Spartans party was founded in 2017 by Vasilis Stigkas and, according to the European Center for Populism Studies, promotes a far-right ideology and is a successor to the Golden Dawn party. The Spartans did not run in the May 2023 elections, but the party did apply to participate in the second set of elections in June that year. Greek law requires political parties to submit applications in order to participate in national parliamentary elections, and these applications must then be certified by a court. On June 8, 2023, the Greek Supreme Court issued a decision allowing 26 parties, four alliances and two independent candidates to participate in the June 2023 election, including the Spartans. Mr. Stigkas, who won one of the 12 seats the Spartans secured with 4.65% of the vote, stated that Mr. Kasidiaris’ support “drove their success.”

Civic space in Greece has been marked by increasing threats and attacks perpetrated by extremist groups and private individuals, who target the human rights of refugees, migrants, LGBTQIA+ communities and religious minorities. Scholars of Greek politics, human rights defenders and local NGOs are concerned that far-right groups, including those affiliated with Golden Dawn, use mainstream social media platforms to spread misinformation and hate speech, actively operating online and offline, with their impact extending beyond what is visible on platforms such as Facebook (see public comments, e.g., PC-20017 from Far Right Analysis Network).

Freedom House’s annual Freedom in the World (2023) report rated Greece as Free with a score of 86/100, noting that the media environment remains highly free and that non-governmental organizations generally operate without interference from the authorities. Still, recent studies published by the Reuters Institute for the Study of Journalism, the International Press Institute and the Incubator for Media Education and Development highlight a significant decline of trust in Greek media, particularly in journalists and broadcast media. This is largely due to concerns about political and business influence on journalism, coupled with the increasing spread of digital media. These studies also reveal concerns about manipulation of information, censorship and declining media independence.

Both posts were reported to Meta, which, after human review, determined the content in both cases violated Facebook’s Dangerous Organizations and Individuals Community Standard. It applied a severe strike and 30-day restriction to both accounts, preventing them from using live video and ad products, without suspending the accounts. Both Facebook users who posted the content appealed, but Meta again found the content to be violating. The two users then separately appealed to the Board.

3. Oversight Board Authority and Scope

The Board has authority to review Meta’s decision following an appeal from the person whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1). When the Board identifies cases that raise similar issues, they may be assigned to a panel as a bundle to deliberate together. A binding decision will be made with regard to each piece of content.

The Board may uphold or overturn Meta’s decision (Charter Article 3, Section 5), and this decision is binding on the company (Charter Article 4). Meta must also assess the feasibility of applying its decision in respect of identical content with parallel context (Charter Article 4). The Board’s decisions may include non-binding recommendations that Meta must respond to (Charter Article 3, Section 4; Article 4). When Meta commits to act on recommendations, the Board monitors their implementation.

4. Sources of Authority and Guidance

The following standards and precedents informed the Board’s analysis in this case:

I. Oversight Board Decisions

The Board’s analysis in these cases drew on its previous decisions cited throughout this decision, including Nazi Quote, Shared Al Jazeera Post, Mention of the Taliban in News Reporting, Punjabi Concern Over the RSS in India and Brazilian General’s Speech.

II. Meta’s Content Policies

The policy rationale for the Dangerous Organizations and Individuals Community Standard explains that in “an effort to prevent and disrupt real-world harm,” Meta does not “allow organizations or individuals that proclaim a violent mission or are engaged in violence to have a presence” on its platforms. Meta assesses “these entities based on their behavior both online and offline – most significantly, their ties to violence.”

According to the policy rationale, organizations and individuals designated under Tier 1 of the Dangerous Organizations and Individuals Community Standard fall into three categories: terrorist organizations, criminal organizations and hate entities. Tier 1 focuses on entities that engage in serious offline harms, including “organizing or advocating for violence against civilians, repeatedly dehumanizing or advocating for harm against people based on protected characteristics, or engaging in systematic criminal operations.” The policy rationale notes that Tier 1 designations result in the most extensive enforcement as Meta believes these entities have “the most direct ties to offline harm.”

Meta defines a “hate entity” as an “organization or individual that spreads and encourages hate against others based on their protected characteristics.” Meta states that the entity’s activities are characterized “by at least some of the following behaviors: violence, threatening rhetoric, or dangerous forms of harassment targeting people based on their protected characteristics; repeated use of hate speech; representation of hate ideologies or other designated hate entities; and/or glorification or support of other designated hate entities or hate ideologies.”

Under Tier 1 of the Dangerous Organizations and Individuals policy as in force in June 2023, Meta did not allow “leaders or prominent members of these organizations to have a presence on the platform, symbols that represent them to be used on the platform or content that praises them or their acts.” At that time, “praise” was defined as any of the following: “speak positively about a designated entity or event” or “aligning oneself ideologically with a designated entity or event.” Following December 2023 updates to the Dangerous Organizations and Individuals policy, the company now removes “glorification, support and representation of Tier 1 entities, their leaders, founders or prominent members, as well as unclear references to them.” This includes “unclear humor, captionless or positive references that do not glorify the designated entity’s violence or hate.”

Meta requires users to clearly state their intent when sharing content that discusses designated entities or their activities. The Dangerous Organizations and Individuals policy allows users to report on, neutrally discuss or condemn designated organizations or individuals or their activities. Meta updated this exception in August 2023 to clarify that users may share content referencing dangerous organizations and individuals or their activities in the context of “social and political discourse.” As Meta publicly announced in a newsroom blog post, the updated “social and political discourse” exception includes content shared in the context of elections.

The Board’s analysis of the content policies was also informed by Meta’s value of voice, which the company describes as “paramount,” as well as its value of safety.

Newsworthiness Allowance

Meta defines the newsworthiness allowance as a general policy exception that can be applied across all policy areas within the Community Standards, including to the Dangerous Organizations and Individuals policy. It allows otherwise violating content to be kept on the platform if the public interest value in doing so outweighs the risk of harm. According to Meta, such assessments are made only in “rare cases,” following escalation to its Content Policy team. This team assesses whether the content in question surfaces an imminent threat to public health or safety or gives voice to perspectives currently being debated as part of a political process. This assessment considers country-specific circumstances, including whether elections are underway. While the speaker's identity is a relevant consideration, the allowance is not limited to content posted by news outlets.

III. Meta’s Human Rights Responsibilities

The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. In 2021, Meta announced its Corporate Human Rights Policy, in which it reaffirmed its commitment to respecting human rights in accordance with the UNGPs. The Board’s analysis of Meta’s human rights responsibilities in this case was informed by the following international standards:

  • The rights to freedom of opinion and expression: Articles 19 and 20, International Covenant on Civil and Political Rights (ICCPR); General Comment No. 34, Human Rights Committee (2011); Joint Declaration on freedom of expression and elections in the digital age, UN Special Rapporteur on Freedom of Opinion and Expression, OSCE Representative on Freedom of the Media and OAS Special Rapporteur on Freedom of Expression (2022); report of the UN Special Rapporteur on Freedom of Opinion and Expression, A/HRC/28/25 (2018);
  • The right to freedom of association: Article 22, ICCPR; reports of the UN Special Rapporteur on freedom of peaceful assembly and of association, A/68/299 (2013) and A/HRC/26/30 (2014);
  • The right to life: Article 6, ICCPR;
  • The right to participate in public affairs and the right to vote: Article 25, ICCPR;
  • The right to non-discrimination: Articles 2 and 26, ICCPR;
  • The right to be free from torture, inhuman and degrading treatment: Article 7, ICCPR;
  • The prohibition against destruction of rights: Article 5, ICCPR; Article 30, UDHR.

5. User Submissions

The author of each post in these two cases appealed Meta’s decision to remove their content to the Board.

In their submission to the Board, the user in the first case stated they were a candidate from a legitimate Greek political party participating in the Greek parliamentary elections and noted that as a result of the strike applied to their account, they were unable to manage their Facebook page.

The user in the second case claimed they shared the logo of the Spartans party, expressing their surprise about the removal of their post.

6. Meta’s Submissions

Meta told the Board that the decisions to remove the content in both cases were based on its Dangerous Organizations and Individuals Community Standard.

Meta informed the Board that Golden Dawn and National Party – Greeks are designated as Tier 1 Hate Organizations, and Mr. Kasidiaris as a Tier 1 Hate Figure. The designation of National Party – Greeks occurred on May 5, 2023. In response to the Board’s questions, Meta noted that the company designates entities in an independent process based on a set of designation signals.

Meta stated that the Facebook user in the first case praised a designated entity by speaking positively about Mr. Kasidiaris. Expressing “ideological alignment” was listed as an example of prohibited praise. Meta explained that the post’s caption indicated the user was distributing leaflets in support of their own parliamentary campaign and their own party, the Spartans. However, the leaflet also stated that Mr. Kasidiaris “supports the party Spartans,” explicitly highlighting that Mr. Kasidiaris, a designated individual, had endorsed the user’s political party. For Meta, this user publicly aligned themselves with Mr. Kasidiaris by promoting the latter’s endorsement. Meta informed the Board that following the December 2023 update to the Dangerous Organizations and Individuals policy, the post in the first case would violate the rule prohibiting “positive references that do not glorify the designated entity's violence or hate.” The post did not contain any explicit glorification of Mr. Kasidiaris or his violent or hateful activities.

In the second case, Meta considered the sharing of the National Party – Greeks’ logo, without any accompanying explanatory caption, to be praise for the party, a designated entity, so it removed the content. Meta informed the Board that following the December 2023 update to the Dangerous Organizations and Individuals policy, the post in the second case would be removed because the user shared a reference (a symbol) to National Party – Greeks without an accompanying explanatory caption, even though it did not contain any explicit glorification of Mr. Kasidiaris or his violent or hateful activities.

Meta found that neither post would have benefited from the Dangerous Organizations and Individuals policy exception in force in June 2023, as neither user clearly indicated their intent to “report on, neutrally discuss or condemn” a designated entity or their actions.

According to Meta, this remained the case after the August 2023 changes, which reframed the exception as permitting “social and political discourse.” In response to the Board’s questions, Meta stated that the “social and political discourse” exception was introduced to permit some types of “content containing explicit context relating to a set of defined categories such as elections,” which it would previously have removed under the policy. Meta was concerned that when a designated entity is officially registered and enrolled in a formal electoral process, removing all praise of or references to the entity would unduly restrict people’s ability to discuss the election and candidates. However, the exception was never intended to encompass substantive support, such as providing a tangible operational or strategic advantage to a designated entity by distributing official campaign material or official propaganda, or allowing official channels of communication on its behalf.

In response to a question from the Board, Meta explained the social and political discourse exception attempts to strike a balance between allowing discussion of designated entities participating in an election while preserving safety by removing substantive support for or glorification of these entities. Meta noted that it intentionally focused the allowance on entities that are registered and formally enrolled in the election process. This is because the allowance aims to permit discussion of candidates who are running for office, while removing glorification of a designated entity’s hate or violence or providing any substantive support to a designated entity. Meta added that “the purpose of creating this allowance was to enable users to express their opinion about their electoral preferences if the designated entity was running in elections, not to allow designated entities to circumvent existing electoral processes and the company’s enforcement to share their agendas.”

For the second case, Meta concluded that the social and political discourse exception under its updated policy would not apply because sharing a symbol or logo of National Party – Greeks with text that identifies the Spartans, without additional commentary (e.g., a caption condemning or neutrally discussing National Party – Greeks), does not clearly indicate the user’s intent. The exception also did not apply because National Party – Greeks, a designated entity, was disqualified from participating in the Greek elections.

The Board asked Meta five questions in writing, relating to the application of Meta’s “social and political discourse” allowance under the Dangerous Organizations and Individuals policy, as well as the transparency of the designation process and the list of designated entities under the policy. Meta answered all five questions.

7. Public Comments

The Oversight Board received 15 public comments that met the terms for submission. Thirteen were submitted from Europe and two from the United States and Canada.

The submissions covered the following themes: the political context in Greece, including discussion of Greek political parties; the 2023 elections in Greece and the impact of social media on election results; far-right and extremist groups in Greece and other European countries, and their use of social media platforms; recent legislative amendments in Greece and their impact on the 2023 elections; and the importance of transparency of entity lists under Meta’s Dangerous Organizations and Individuals policy.

8. Oversight Board Analysis

The Board selected these cases to assess the impact of Meta’s Dangerous Organizations and Individuals Community Standard on freedom of expression and political participation, especially during elections when designated entities or persons associated with them may be active in political discourse. The cases fall under the Board’s strategic priorities of Elections and Civic Space and Hate Speech Against Marginalized Groups. The Board examined whether this content should be restored by analyzing Meta’s content policies, human rights responsibilities and values.

8.1 Compliance With Meta’s Content Policies

The Board upholds Meta’s decisions to remove the content in both cases.

First Case: An Electoral Candidate’s Campaign Leaflet

The Board notes that Meta’s commitment to voice is paramount and is of heightened importance in electoral contexts. The Board emphasizes that to provide voters with access to the fullest information to cast their vote, Meta should allow public discourse among the electorate, candidates and parties on the activities of designated entities.

The Board finds that this post fell under Meta’s prohibition of “praise” of a designated entity that was in force in June 2023 because the user ideologically aligned themselves with Mr. Kasidiaris, a designated hate figure under Tier 1 of the Dangerous Organizations and Individuals policy. This was clearly described in the relevant Community Standard as conduct that Meta considers to be an example of prohibited “praise.” Following the December 30, 2023, policy changes, the content would fall under the prohibition on positive references to a designated entity that do not glorify the designated entity’s violence or hate.

For a minority of Board Members, the application of the rule on ideological alignment was not immediately obvious because Mr. Kasidiaris was endorsing (i.e., “praising” or “referencing”) the user, rather than vice versa. It required some level of inference to conclude that the user was in effect reciprocating that endorsement and thus fell afoul of Meta’s rule on ideological alignment.

A minority of the Board consider that while this post violated the Dangerous Organizations and Individuals policy and did not fall under any policy exception in force in June 2023, Meta should have applied its newsworthiness allowance to keep this content on the platform, given that the public interest in the post outweighed the risk of harm. The post directly informed voters about a convicted criminal’s endorsement of an electoral candidate, which is relevant and valuable information in the electoral context, especially during the second set of elections, given the participation of a new party. These Board Members note that following the August 2023 updates to the Dangerous Organizations and Individuals policy, under the “social and political discourse” exception, Meta should allow lawful candidates in elections to express in neutral terms their ideological alignment with designated entities, absent any inclusion of hate speech or incitement of specific harm. This will enable voters to have the fullest possible information on which to make a decision.

Second Case: The Logo of National Party – Greeks and the Slogan “Spartans”

The majority of the Board find that the content violates the Dangerous Organizations and Individuals Community Standard because it shared a symbol of National Party – Greeks, which is a designated hate entity.

This post does not fall under the policy exception in force in June 2023, as there are no contextual indications that the user intended to feature the logo of National Party – Greeks alongside the name of a lawful party, the Spartans, to “report on, neutrally discuss or condemn” National Party – Greeks or its activities. The majority of the Board distinguish these posts from the content in the Nazi Quote case, where contextual cues allowed the Board to conclude that the user’s post neutrally discussed a designated hate entity. In that case, the user referenced a quote from a known historical figure that did not show ideological alignment with the person but attempted to draw “comparisons between the presidency of Donald Trump and the Nazi regime.” No such context is present in this case. Following the December 30, 2023, policy changes, the content in this case would be removed for sharing a reference (symbol) of a designated entity without an explanatory caption.

A minority of the Board consider that this post should not be found to violate the Dangerous Organizations and Individuals policy. They note that simply sharing logos associated with a designated entity, absent other violations or context of harmful intent, should be allowed on the platform.

8.2 Compliance With Meta’s Human Rights Responsibilities

The Board finds that Meta’s decisions to remove the content in both cases were consistent with the company’s human rights responsibilities.

Freedom of Expression (Article 19 ICCPR)

Article 19 (2) of the ICCPR provides for broad protection of expression, including “to seek, receive and impart information and ideas of all kinds.” Protected expression includes “political discourse,” “commentary on public affairs” and expression that may be considered “deeply offensive” (General Comment No. 34 (2011), para. 11). In an electoral context, the right to freedom of expression also covers access to sources of political commentary, including local and international media, and “access of opposition parties and politicians to media outlets” (General Comment No. 34 (2011), para. 37).

When restrictions on expression are imposed by a state, they must meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR). These requirements are often referred to as the “three-part test.” The Board uses this framework to interpret Meta’s voluntary human rights commitments, both in relation to the individual content decision under review and what this says about Meta’s broader approach to content governance. As the UN Special Rapporteur on freedom of expression has stated, although “companies do not have the obligations of Governments, their impact is of a sort that requires them to assess the same kind of questions about protecting their users’ right to freedom of expression,” (report A/74/486, para. 41).

I. Legality (Clarity and Accessibility of the Rules)

The principle of legality under international human rights law requires rules that limit expression to be clear and publicly accessible (General Comment No. 34, para. 25). Restrictions on expression should be formulated with sufficient precision to enable individuals to regulate their conduct accordingly (ibid.). As applied to Meta, the company should provide guidance to users as to what content is permitted on the platform and what is not. Additionally, rules restricting expression “may not confer unfettered discretion on those charged with [their] execution” and must “provide sufficient guidance to those charged with their execution to enable them to ascertain what sorts of expression are properly restricted and what sorts are not,” (A/HRC/38/35, para. 46).

For the first case, the Board notes that the examples of “praise” were added to the public-facing language of the Dangerous Organizations and Individuals policy in response to the Board’s recommendation no. 2 in the Nazi Quote case. The explicit example prohibiting “aligning oneself ideologically with a designated entity or event” made Meta’s rule sufficiently clear and accessible to the user in the first case and to the content reviewers implementing the rule. The Board notes that this example was removed in the December 2023 update.

In relation to the second case, the Board agrees that Meta’s policy against sharing symbols of designated entities, unless the user clearly states their intent to report on, neutrally discuss or condemn those entities, is sufficiently clear and meets the legality test. The Board further finds that, as applied to the second case, the Dangerous Organizations and Individuals policy exception meets the legality test both before and after the August 2023 revisions.

The Board is, nonetheless, concerned about the lack of transparency around the designation of hate entities and which entities are included under Tier 1 of the Dangerous Organizations and Individuals policy. This makes it challenging for users to understand which entities they are or are not permitted to express ideological alignment with, or whose symbols they can share.

Tier 1 terrorist organizations include entities and individuals designated by the United States government as Foreign Terrorist Organizations (FTOs) or Specially Designated Global Terrorists (SDGTs), and criminal organizations include those designated by the United States government as Specially Designated Narcotics Trafficking Kingpins (SDNTKs). The U.S. government publishes lists of FTO, SDGT and SDNTK designations, which correspond to at least some of Meta’s Dangerous Organizations and Individuals designations. However, Meta’s full list of Tier 1 “hate entity” designations is not based on an equivalent public U.S. list. The Board called for transparency around the Tier 1 entity list in the Nazi Quote case, a recommendation Meta declined to implement for “safety reasons.”

In response to recommendation no. 1 in the Shared Al Jazeera Post case, the August 2023 update supplemented the public-facing language of Meta’s Dangerous Organizations and Individuals policy with several examples of how the exception applies. The Board finds that the full scope of the updated exception is not clear to users, as none of the examples illustrates the application of the policy exception in the context of elections. In circumstances of shrinking civic space and threats to media freedom globally, social media platforms serve as an invaluable information source. Given the uncertainty about the scope of the updated policy exception during electoral periods, users in such contexts could be unsure what types of discussion they can engage in on electoral candidates and their supporters, who may also be Tier 1 designated entities.

The Board finds that Meta’s prohibition of “praise” in the form of ideological alignment as well as prohibition of sharing symbols of designated entities as in force in June 2023 met the legality standard. However, the extent of “social and political discourse” about designated entities permitted in the electoral context requires further clarification.

II. Legitimate Aim

Restrictions on freedom of expression must pursue a legitimate aim, which includes the protection of the rights of others and the protection of public order and national security.

According to the policy rationale, Meta’s Dangerous Organizations and Individuals policy aims to “prevent and disrupt real-world harm.” In several decisions, the Board has found that Meta’s Dangerous Organizations and Individuals policy pursues the legitimate aim of protecting the rights of others (see Nazi Quote; Mention of the Taliban in News Reporting; Punjabi Concern Over the RSS in India). The Board finds that in these two cases, Meta’s policy pursues a legitimate aim of protecting the rights of others, such as the right to non-discrimination and equality (ICCPR, Articles 2 and 26), the right to life (ICCPR, Article 6), the prohibition of torture, inhuman and degrading treatment (ICCPR, Article 7), and the right to participate in public affairs and the right to vote (ICCPR, Article 25).

III. Necessity and Proportionality

The principle of necessity and proportionality provides that any restrictions on freedom of expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; [and] they must be proportionate to the interest to be protected,” (General Comment No. 34, paras. 33-34).

Elections are crucial for democracy, and the Board acknowledges that Meta’s platforms have become a virtually indispensable medium for political discourse in most parts of the world, especially in election periods. Given its close relationship with democracy, political speech “enjoys a heightened level of protection,” (General Comment No. 37, paras. 19 and 32). The international freedom of expression mandates have noted that “digital media and platforms should make a reasonable effort to adopt measures that allow users to access a diversity of political views and perspectives,” (Joint Declaration 2022). The UN Special Rapporteur on freedom of peaceful assembly and of association stated that “the freedom of political parties to expression and opinion, particularly through electoral campaigns, including the right to seek, receive and impart information, is as such, essential to the integrity of elections,” (A/68/299, para. 38 (2013)).

However, to mitigate adverse human rights impacts, it is crucial to distinguish between protected political speech and political expression that can be restricted because it may further harm. In this regard, as the Board has noted, Meta has a responsibility to identify, prevent, mitigate and account for adverse human rights impacts arising from the use of its platforms (UNGPs, Principle 17).

The UN Special Rapporteur on freedom of peaceful assembly and of association underlined that a political party or any of its candidates can be lawfully prohibited if they “use violence or advocate violence or national, racial or religious hatred constituting incitement to discrimination, hostility or violence,” (ICCPR, Article 20; ICERD, Article 5). Any restrictions under Article 20, ICCPR, and Article 5, ICERD, must meet the standards of necessity and proportionality under Article 19, para. 3, ICCPR (General Comment No. 34, paras. 50-52; CERD/C/GC/35, paras. 24-25).

First Case: An Electoral Candidate’s Campaign Leaflet

The majority of the Board consider that Meta's decision to remove the first post under its Dangerous Organizations and Individuals policy satisfies the principles of necessity and proportionality. The majority acknowledge the importance of freedom of expression during elections, including users’ rights to share and receive information. However, these Board Members find that Meta was justified in removing the post of an electoral candidate expressing ideological alignment with a designated hate figure. This prohibition, coupled with the allowance for users to “report on, neutrally discuss or condemn” designated entities or their activities, including endorsements of this kind during elections, is in line with Meta’s human rights commitments.

In this case, these Board Members understand that removing this post from Meta’s platform did not disproportionately restrict the public’s right to know the information contained in it. Given the multiple local and regional media reports on the endorsement by the designated individual, who was convicted of leading a criminal organization connected with hate crimes, the public had other opportunities to learn about this expression of support for the candidate’s party. These media reports would have qualified for the policy exception, which allows for lawful discussion in electoral contexts, without furthering any real-world harm.

Meta’s responsibility to prevent, mitigate and address adverse human rights impacts is heightened in electoral and other high-risk contexts, and requires the company to establish effective guardrails against harm. Meta has a responsibility both to allow political expression and to avoid serious risks to other human rights. Given the potential risk of its platforms being used to incite violence in the context of elections, Meta should continuously ensure the effectiveness of its election integrity efforts (see Brazilian General’s Speech). In view of the multiple elections around the world, Meta’s careful enforcement of the Dangerous Organizations and Individuals policy, especially its updated policy exception in electoral contexts, is imperative.

For some Board Members, a lawful candidate’s post publicizing the support offered by a Tier 1 designated entity is not information about the candidate’s program, but an act of association with a prohibited party. Such posts can be used to circumvent Meta’s prohibition on Tier 1 designated entities using its services and to undermine the democratic process (ICCPR, Article 5). Furthermore, in the present case, where the public had sufficient opportunities to learn about the existing alliances, the removal of the candidate’s post was not disproportionate.

For a minority, removing the content in the first case disproportionately interfered with users’ rights to share and receive information during an election. These Board Members highlight that Meta’s “commitment to expression is paramount” and in this case the company has erred by prioritizing safety over voice. The electorate should have access to information about candidates and their activities, and a party that has been allowed by the Greek Supreme Court to participate in an election should likewise have the widest latitude on what information its candidates can publish. In this case, since the Spartans is a newer party, voters may not yet know much about it.

At the same time, given the reports on decreasing trust towards media in Greece (see section 2 above), voters should have the opportunity to hear directly from lawful candidates. This is especially needed when candidates or their parties receive support or allegiance from entities disqualified from running for elections or those that may be designated under the Dangerous Organizations and Individuals policy.

These Board Members note that a social media platform should not become the arbiter of what voters are and are not allowed to know about a candidate or party. They consider that, given the importance of the electoral context, removal of the content in the first case was not the least intrusive means available and was a disproportionate restriction of the candidate’s speech and the electorate’s right of access to information. Instead, in line with Meta’s values and human rights commitments, the company should have kept the post up under its newsworthiness allowance. Given that the content was an electoral post from a lawful candidate, published during the elections in Greece and directly informing the electorate about his campaign and the support from Mr. Kasidiaris, the public’s interest in knowing more about the parties and candidates outweighed the risk of harm.

Second Case: The Logo of National Party – Greeks and the Slogan “Spartans”

In the second case, the majority of the Board find that Meta’s removal of the content was necessary and proportionate, as the post shared a symbol of a designated hate entity. In the absence of any contextual cues that the content was shared to report on, neutrally discuss or condemn a designated entity, the removal was justified.

A minority of the Board consider that Meta erred in removing this content. This minority note that a contextual analysis is required when determining if the content is harmful. Removal of a post simply sharing a symbol of a designated entity, without any indication of incitement to violence or unlawful action, is disproportionate and cannot be the least intrusive means to protect against harm.

9. Oversight Board Decision

The Oversight Board upholds Meta’s decisions to take down the posts in both cases.

10. Recommendations

Content Policy

1. To provide greater clarity to users, Meta should clarify the scope of the policy exception under the Dangerous Organizations and Individuals Community Standard, which allows for content “reporting on, neutrally discussing or condemning dangerous organizations and individuals or their activities” to be shared in the context of “social and political discourse.” Specifically, Meta should clarify how this policy exception relates to election-related content.

The Board will consider this implemented when Meta updates the public-facing language of its Dangerous Organizations and Individuals Community Standard to reflect this clarification.

*Procedural Note:

The Oversight Board’s decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.

For this decision, independent research was commissioned on behalf of the Board. The Board was assisted by an independent research institute headquartered at the University of Gothenburg, which draws on a team of over 50 social scientists on six continents, as well as more than 3,200 country experts from around the world. The Board was also assisted by Duco Advisors, an advisory firm focusing on the intersection of geopolitics, trust and safety, and technology. Memetica, an organization that engages in open-source research on social-media trends, also provided analysis.
