Oversight Board Upholds Meta’s Decision in Politician’s Comments on Demographic Changes Case
March 12, 2024
The Oversight Board has upheld Meta’s decision to leave up a video clip in which French politician Éric Zemmour discusses demographic changes in Europe and Africa. The content does not violate the Hate Speech Community Standard because it contains no direct attack on people based on a protected characteristic such as race, ethnicity or national origin. The majority of the Board find that leaving up the content is consistent with Meta’s human rights responsibilities. However, the Board recommends that Meta publicly clarify how it distinguishes immigration-related discussions from harmful speech, including hateful conspiracy theories, targeting people based on their migratory status.
About the Case
In July 2023, a video clip in which French politician Éric Zemmour discusses demographic changes in Europe and Africa was posted on his official Facebook page by a user who is the page’s administrator. The clip is part of a longer video interview with the politician. In the video, Zemmour states: “Since the start of the 20th century, there has been a population explosion in Africa.” He goes on to say that while the European population has stayed roughly the same at around 400 million people, the African population has increased to 1.5 billion people, “so the power balance has shifted.” The post’s caption, in French, says that in the 1900s, “when there were four Europeans for one African, [Europe] colonized Africa,” and now “there are four Africans for one European and Africa colonizes Europe.” Zemmour’s Facebook page has about 300,000 followers, and the post had been viewed about 40,000 times as of January 2024.
Zemmour has been the subject of multiple legal proceedings, including more than one conviction in France for inciting racial hatred and making racially insulting comments about Muslims, Africans and Black people. He ran for president in 2022 but did not progress beyond the first round. Central to his electoral campaigning is the Great Replacement Theory, which claims that white European populations are being deliberately replaced, ethnically and culturally, through migration and the growth of minority communities. Linguistic experts note that the theory and the terms associated with it incite racism, hatred and violence targeting immigrants and non-white Europeans, with Muslims targeted specifically. The video in the post does not explicitly mention the theory.
Two users reported the content for violating Meta’s Hate Speech policy, but because the reports were not prioritized for review within a 48-hour period, both were automatically closed. Meta’s automated systems prioritize reports according to the severity of the predicted violation, the content’s virality (number of views) and the likelihood of a violation. One of the users then appealed to Meta, and one of the company’s human reviewers decided the content did not violate Meta’s rules. The user then appealed to the Board.
Key Findings
The majority of the Board conclude the content does not violate Meta’s Hate Speech Community Standard. The video clip is an example of protected, albeit controversial, expression of opinion on immigration: it contains no call to violence and does not direct dehumanizing or hateful language at vulnerable groups. While Zemmour has previously been convicted for his use of hateful language, and themes in this video resemble the Great Replacement Theory, these facts do not justify removing a post that does not itself violate Meta’s standards.
For there to have been a violation, the post would have had to include a “direct attack,” specifically a call for the “exclusion or segregation” of a group defined by a “protected characteristic.” Zemmour’s comments contain no explicit call to exclude any group from Europe, nor any statement about Africans amounting to a harmful stereotype, slur or other direct attack, so they do not break Meta’s Hate Speech rules. The policy rationale also makes it clear that Meta allows “commentary on and criticism of immigration policies,” although the company does not disclose publicly that it also allows calls for exclusion when immigration policies are being discussed.
However, the Board does find it concerning that Meta does not consider Africans a protected characteristic group, given that national origin, race and religion are protected both under Meta’s policies and international human rights law. Africans are mentioned throughout the content and, in this video, the term serves as a proxy for race.
The Board also considered the relevance of the Dangerous Organizations and Individuals policy to this case. However, the majority find the post does not violate this policy because there are not enough elements to review it as part of a wider Violence-Inducing Conspiracy Network. Meta defines these networks as non-state actors that share the same mission statement, promote unfounded theories claiming that secret plots by powerful actors are behind social and political problems, and are directly linked to a pattern of offline harm.
A minority of Board Members find that Meta’s approach to content spreading harmful conspiracy theories is inconsistent with the aims of the policies it has designed to prevent an environment of exclusion for protected minorities, both online and offline. Under these policies, content involving certain other conspiracy narratives is moderated to protect threatened minority groups. While these Board Members believe that criticism of issues like immigration should be allowed, it is precisely because evidence-based discussion of this topic is so important that the spread of conspiracy theories such as the Great Replacement Theory can be harmful. It is not individual pieces of content but the combined effect of such content, shared at large scale and high speed, that poses the greatest challenge to social media companies. Meta therefore needs to reformulate its policies so that its services are not misused by those who promote conspiracy theories that cause online and offline harm.
Meta has researched a policy line that could address hateful conspiracy theories but decided it would ultimately lead to the removal of too much political speech. The Board is concerned about how little information Meta shared on this process.
The Oversight Board’s Decision
The Oversight Board has upheld Meta’s decision to leave up the post.
The Board recommends that Meta:
- Provide greater detail in the language of its Hate Speech Community Standard on how it distinguishes immigration-related discussions from harmful speech targeting people based on their migratory status. This includes explaining how the company handles content spreading hateful conspiracy theories, so that users can understand how Meta protects political speech on immigration while addressing the potential offline harms of these theories.
For Further Information
To read the full decision, click here.
To read a synopsis of public comments for this case, click here.