Multi-case decision

Cartoons About College Protests


2 cases included in this package

Overturned

FB-TO38JZ4O

Case about dangerous organizations and individuals on Facebook

Platform
Facebook
Topic
Freedom of expression, Politics, Protests
Standard
Dangerous Organizations and Individuals
Location
United States
Date
Published on September 12, 2024
Overturned

FB-0F4BU4NY

Case about dangerous organizations and individuals on Facebook

Platform
Facebook
Topic
Freedom of expression, Politics, Protests
Standard
Dangerous Organizations and Individuals
Location
United States
Date
Published on September 12, 2024

Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

In these two summary decisions, the Board reviewed two posts containing cartoons about college protests. Meta removed both posts on initial review. However, after the Board brought these cases to Meta’s attention for additional review, the company reversed its original decisions and restored both posts.

About the Cases

In the first case, in May 2024, a Facebook user posted a cartoon depicting two people. The first is wearing a headband with "Hamas" written on it and holding a weapon. That person is saying that they will kill Jewish people until Israel is "wiped off the map." The second person is dressed in a T-shirt with "college" written on it. That person is holding a book and saying, "It's obvious they just want to live in peace."

In the second case, in April 2024, a Facebook user posted a cartoon that shows a family eating together. The son figure looks like Adolf Hitler and wears a shirt emblazoned "I [heart] Hamas." The father figure expresses concern about how college has changed his son.

Both users posted the content in the context of the protests related to the Israel-Gaza conflict taking place at universities across the United States.

Meta originally removed both posts from Facebook under its Dangerous Organizations and Individuals (DOI) policy. Under the DOI policy, the company removes "Glorification," "Support," and "Representation" of designated entities, their leaders, founders, or prominent members, and unclear references to them.

In their appeals to the Board, both users stated that they posted a political cartoon that does not violate Meta's Community Standards. After the Board brought these two cases to Meta's attention, the company determined that the posts did not violate its policies and restored both pieces of content to its platform.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.

Significance of Cases

These cases highlight errors in the enforcement of the exception to Meta’s Dangerous Organizations and Individuals policy that allows content “reporting on, neutrally discussing or condemning dangerous organizations and individuals and their activities,” in order to safeguard a space for “social and political discourse.”

The Board has issued several recommendations to increase transparency around the enforcement of Meta’s Dangerous Organizations and Individuals policy and its exceptions. The Board has also issued recommendations to address enforcement challenges associated with this policy. This includes a recommendation to “assess the accuracy of reviewers enforcing the reporting allowance under the Dangerous Organizations and Individuals policy in order to identify systemic issues causing enforcement errors.” While Meta reported it had implemented this recommendation, it did not publish information to demonstrate this (Mention of the Taliban in News Reporting, recommendation no. 5).

The Board also recommended that Meta “add criteria and illustrative examples to Meta’s Dangerous Organizations and Individuals policy to increase understanding of exceptions, specifically around neutral discussion and news reporting,” a recommendation for which Meta demonstrated implementation through published information (Shared Al Jazeera Post, recommendation no. 1). Furthermore, in a policy advisory opinion, the Board asked Meta to “explain the methods it uses to assess the accuracy of human review and the performance of automated systems in the enforcement of its Dangerous Organizations and Individuals policy” (Referring to Designated Dangerous Individuals as “Shaheed,” recommendation no. 6). Meta reframed this recommendation. The company shared information about the audits it conducts to assess the accuracy of its content moderation decisions and how this informs areas for improvement. Meta did not, however, explain the methods it uses to perform these assessments, nor has the company committed to sharing the outcome of such assessments.

Additionally, the Board has issued recommendations regarding Meta’s enforcement of satirical content. These include a recommendation for Meta to “make sure it has adequate procedures in place to assess satirical content and relevant context properly. This includes providing content moderators with: (i) access to Facebook’s local operation teams to gather relevant cultural and background information; and (ii) sufficient time to consult with Facebook’s local operation teams and to make the assessment. [Meta] should ensure that its policies for content moderators incentivize further investigation or escalation where a content moderator is not sure if a meme is satirical or not,” a recommendation that Meta reported implementing but did not publish information to demonstrate (Two Buttons Meme decision, recommendation no. 3). The Board also recommended that Meta include the satire exception in the public language of the Hate Speech Community Standard, a recommendation for which Meta demonstrated implementation through published information (Two Buttons Meme decision, recommendation no. 2).

Decision

The Board overturns Meta’s original decisions to remove the two pieces of content. The Board acknowledges Meta’s corrections of its initial errors once the Board brought these cases to Meta’s attention.
