Public Comments Portal

Content Targeting Human Rights Defender in Peru

Deadline: 23:59 PST, 28 January 2025

Accepted Languages: Spanish, English

14 January 2025: Case Selected
14 January 2025: Public Comments Opened
Upcoming: Decision Published
Upcoming: Meta Implements Decision

Case Description

In July 2024, a Facebook user in Peru posted a digitally altered headshot of the leader of a human rights organization in Peru. The image of the human rights defender appears to be AI-manipulated, showing their face covered with blood that is dripping downward. A caption in Spanish insinuates financial wrongdoing by non-governmental organizations (NGOs) and accuses NGOs of encouraging violent protests. The post was shared around the time of demonstrations in Peru’s capital when citizens protested against the government. It was viewed around 1,000 times and received fewer than 100 reactions.

Three days after the content was posted, a user reported it for violating Meta’s Community Standards. A human reviewer determined the content did not violate Meta’s policies and the post was kept up on the platform. The user appealed Meta’s decision, but that appeal was automatically closed without further human review.

The user who reported the post then appealed to the Board, stating that the image was a “thinly-veiled death threat” against a human rights defender. They added that the post should be interpreted within a broader context of “harassment and physical attacks” against human rights defenders in Peru, and that it was shared in response to the July 2024 demonstrations. They explained that the user who posted the content is a member of “La Resistencia,” a group known for inciting violence against human rights defenders and journalists in Peru, and that such online threats have escalated into offline violence.

In the time between the user appealing to the Board and the Board selecting this case, the post was also reported to Meta through its Trusted Partner program. Meta’s Trusted Partner program is a network of non-governmental organizations, humanitarian agencies and human rights researchers from 113 countries. In connection with this report, Meta’s internal escalation teams reviewed the account associated with the post and found it was in violation of Meta’s Terms of Service for reasons unrelated to the content. Meta then disabled the account, making the content inaccessible on Facebook. The content was not assessed further at that time as the user’s account had been disabled.

When the Board selected this case, Meta’s policy subject matter experts reviewed the post again, confirming the original decision that the content did not violate the Community Standards, including Violence and Incitement and Bullying and Harassment. However, Meta noted that, because the content was no longer live on its platforms, it did not reach out to a broader cross-functional team or external parties for the additional input it might otherwise have sought to assess whether the content was a veiled threat. Threats that are “veiled or implicit” require “additional information and/or context to enforce,” according to the Violence and Incitement Community Standard.

The Board selected this case to examine how Meta enforces policies that aim to protect human rights defenders, particularly when threats of violence are veiled or implicit, require additional context to interpret, or occur within an environment of intimidation and harassment. This case falls within the Board’s strategic priority of Elections and Civic Space.

The Board would appreciate public comments that address:

  • The sociopolitical context in Peru, in particular risks to the safety and freedom of expression of human rights defenders, journalists and civil society organizations.
  • Recent laws and bills in Peru and elsewhere in the region that limit or undermine spaces for expression, assembly and political participation of civil society organizations.
  • The use of social media to spread narratives accusing NGOs of wrongdoing and if this type of content has been associated with explicit or implicit (coded) calls for offline violence.
  • Policy recommendations for the protection of human rights defenders that have already been made to social media platforms, as well as the outcomes of campaigns to implement those recommendations.
  • The use of images, including those that are digitally altered or AI-manipulated, to harass, intimidate and make threats of violence against activists and journalists.
  • Moderation of veiled or implicit threats of violence that require additional context to interpret. This can include the impact of error rates in removing or failing to remove content containing veiled or implicit threats on freedom of expression and other human rights.

As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

Public Comments

If you or your organization feel you can contribute valuable perspectives that can help with reaching a decision on the case announced today, you can submit your contributions using the button below. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23:59 Pacific Standard Time (PST) on Tuesday, 28 January 2025.

What’s Next

Over the next few weeks, Board Members will be deliberating this case. Once they have reached their decision, we will post it on the Decisions page.