Overturned
Statement About the Chinese Communist Party
A user appealed Meta’s decision to remove an Instagram comment calling for the “death” of the Chinese Communist Party.
Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company's attention and include information about Meta's acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta's decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.
Summary
A user appealed Meta’s decision to remove an Instagram comment calling for the “death” of the Chinese Communist Party. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.
About the Case
In March 2024, an Instagram user posted a comment saying, “Death to the Chinese Communist Party!” followed by skull emojis. This was in response to a post from a news outlet’s account, featuring a video of Wang Wenbin, a former spokesperson for China’s Ministry of Foreign Affairs, condemning the passing of a bill in the United States House of Representatives that could impact TikTok’s presence in the country.
Meta initially removed the user’s comment from Instagram under its Violence and Incitement Community Standard, which prohibits “threats of violence.” The company explained that the prohibition includes “certain calls for death if they contain a target and method of violence.”
When the Board brought this case to Meta’s attention, the company determined that removal of the comment was incorrect and restored the content to Instagram. Meta explained that, under its internal guidelines for content reviewers, calls for the death of an institution like the Chinese Communist Party are treated as non-violating.
Board Authority and Scope
The Board has authority to review Meta's decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.
Significance of Case
This case highlights an inconsistency in how Meta enforces its Violence and Incitement policy against metaphorical or figurative statements in a political context, which can disproportionately impact political speech that is critical of states and governmental institutions. The case underlines the importance, when Meta designs its moderation systems, of taking into consideration the target of the speech (in this case, a political party), as well as people’s use of hyperbolic, rhetorical, ironic and satirical speech to criticize institutions.
On rhetorical discourse, the Board observed in the Russian Poem case that excerpts containing violent language from the poem “Kill him!” may be read as “describing, not encouraging, a state of mind.” The Board determined that the language was employed as a rhetorical device to convey the user’s message and that, as a result, that part of the content was permitted under Meta’s internal guidelines on its Violence and Incitement policy.
Although it addresses a different Community Standard (Hate Speech) from the one at issue in this case (Violence and Incitement), the Myanmar Bot decision is relevant because it also concerns speech directed at states or political institutions. There, the Board concluded that since the profanity in the post did not target people based on race, ethnicity or national origin, but rather a state, it did not violate the Hate Speech Community Standard. The Board emphasized: “It is crucial to ensure that prohibitions on targeting people based on protected characteristics not be construed in a manner that shields governments or institutions from criticism.”
The Board has previously urged Meta to put in place adequate procedures for evaluating content in its relevant context (“Two Buttons” Meme, recommendation no. 3). It has also recommended: “To better inform users of the types of statements that are prohibited, Meta should amend the Violence and Incitement Community Standard to (i) explain that rhetorical threats like ‘death to X’ statements are generally permitted, except when the target of the threat is a high-risk person…” (Iran Protest Slogan, recommendation no. 1); and “Meta should err on the side of issuing scaled allowances where (i) this is not likely to lead to violence; (ii) when potentially violating content is used in protest contexts; and (iii) where public interest is high” (Iran Protest Slogan, recommendation no. 2).
Meta reported implementation of the “Two Buttons” Meme recommendation and recommendation no. 2 from the Iran Protest Slogan decision, but did not publish information to demonstrate this. For recommendation no. 1 from Iran Protest Slogan, in its Q4 2023 Quarterly Update on the Board, Meta stated: “We have updated our Violence and Incitement Community Standards by providing further details about what constitutes a ‘threat’ and distinguishing our enforcement based on target. As part of this work, we also updated internal guidance.”
The Board believes that full implementation of these recommendations could contribute to decreasing the number of enforcement errors under the Violence and Incitement policy.
Decision
The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.