Oversight Board Announces Politician’s Comments on Demographic Changes Case

Today, the Board is announcing a new case for consideration. As part of this, we are inviting people and organizations to submit public comments.

Case Selection

As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse, or raise important questions about Meta's policies.

The case that we are announcing today is:

Politician’s Comments on Demographic Changes

2023-033-FB-UA

User appeal to remove content from Facebook

Submit public comments, which can be provided anonymously, here.

In July 2023, a user posted a video on Facebook in which French politician Éric Zemmour is interviewed about demographic changes in Europe and Africa. The user who posted the video is an administrator for Zemmour’s official, verified Facebook page, which has about 300,000 followers. A candidate in the 2022 French presidential election, Zemmour won around 7% of the votes in the first round, according to official results, but did not advance any further. He has been found guilty of “inciting discrimination and religious hatred” in France, a conviction that was upheld by the European Court of Human Rights.

In the video, Zemmour claims the European population has stayed roughly the same since the beginning of the 20th century, while the African population has increased significantly, “so the power balance has shifted.” The caption in French repeats the claims in the video, stating that “when there were four Europeans for one African, [Europe] colonized Africa,” and now “there are four Africans for one European and Africa colonizes Europe.” The content was viewed about 20,000 times and had fewer than 1,000 reactions, the majority of which were “likes,” followed by “love.”

Under its Hate Speech policy, Meta removes direct attacks against people on the basis of protected characteristics, including race, ethnicity, national origin and religious affiliation. Refugees, migrants, immigrants and asylum seekers are protected against “the most severe attacks,” although Meta allows “commentary and criticism of immigration policies.”

The content in this case was reported twice under the Hate Speech policy. Meta’s automated systems closed both reports and the video was left up on Facebook. The first user who reported the content appealed Meta’s decision but, following human review on the same day, the company decided it was correct to leave the video up. The same user then appealed Meta’s decision to the Board. In their statement, they described the content as “fake news.” After the Board selected the case, Meta confirmed that its original decision was correct and explained that, in its view, Zemmour’s claims did not violate the Hate Speech policy because they do not contain an attack on a protected group. The company does not consider the claim that one group is “colonizing” a place to be an attack “so long as it does not amount to a call for exclusion.”

The Board selected this case because of the increasing salience of policies toward immigration and migrants in elections around the world, and the attendant rise of anti-migrant content around election periods, including such claims as the “Great Replacement.” The “Great Replacement” is a claim that white European populations are being demographically replaced by non-white peoples. This case falls within the Board’s strategic priorities of Hate Speech Against Marginalized Groups and Elections and Civic Space.

The Board would appreciate public comments that address:

  • Whether the post should be understood as a direct attack on the basis of protected characteristics, in violation of Meta’s Hate Speech policies, or instead as commentary on immigration policy and related social trends.
  • The social and political context of discussions about immigration in France.
  • Views on how Meta’s Hate Speech policies comport with its human rights responsibilities, and whether any changes should be considered.
  • Whether and how the company’s content moderation around its Hate Speech and other applicable policies should be affected by who posts the content, specifically high-profile users such as politicians.
  • Views on how Meta should distinguish “commentary and criticism of immigration policies” from direct attacks on people based on protected characteristics, especially during election periods.

As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

Public Comments

If you or your organization can contribute valuable perspectives that will help the Board reach a decision on the case announced today, you can submit your contributions using the link above. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23:59 your local time on Tuesday 12 December.

What’s Next

Over the next few weeks, Board members will be deliberating this case. Once they have reached their final decision, we will post it on the Oversight Board website. To receive updates when the Board announces new cases or publishes decisions, sign up here.
