
Oversight Board overturns original Facebook decision: Case 2021-006-IG-UA


July 2021

The Oversight Board has overturned Facebook’s original decision to remove an Instagram post encouraging people to discuss the solitary confinement of Abdullah Öcalan, a founding member of the Kurdistan Workers’ Party (PKK). After the user appealed and the Board selected the case for review, Facebook concluded that the content was removed in error and restored it. The Board is concerned that Facebook misplaced an internal policy exception for three years and that this may have led to many other posts being wrongly removed.

About the case

This case relates to Abdullah Öcalan, a founding member of the PKK. This group has used violence in seeking to achieve its aim of establishing an independent Kurdish state. Both the PKK and Öcalan are designated as dangerous entities under Facebook’s Dangerous Individuals and Organizations policy.

On January 25, 2021, an Instagram user in the United States posted a picture of Öcalan which included the words “y’all ready for this conversation” in English. In a caption, the user wrote that it was time to talk about ending Öcalan’s isolation in prison on Imrali island in Turkey. The user encouraged readers to engage in conversation about Öcalan’s imprisonment and the inhumane nature of solitary confinement.

After being assessed by a moderator, the post was removed on February 12 under Facebook’s rules on Dangerous Individuals and Organizations as a call to action to support Öcalan and the PKK. When the user appealed this decision, they were told their appeal could not be reviewed because of a temporary reduction in Facebook’s review capacity due to COVID-19. However, a second moderator did carry out a review of the content and found that it violated the same policy. The user then appealed to the Oversight Board.

After the Board selected this case and assigned it to a panel, Facebook found that a piece of internal guidance on the Dangerous Individuals and Organizations policy was “inadvertently not transferred” to a new review system in 2018. This guidance, developed in 2017 partly in response to concern about the conditions of Öcalan’s imprisonment, allows discussion on the conditions of confinement for individuals designated as dangerous.

In line with this guidance, Facebook restored the content to Instagram on April 23. Facebook told the Board that it is currently working on an update to its policies to allow users to discuss the human rights of designated dangerous individuals. The company asked the Board to provide insight and guidance on how to improve these policies. While Facebook updated its Community Standard on Dangerous Individuals and Organizations on June 23, 2021, these changes do not directly impact the guidance the company requested from the Board.

Key findings

The Board found that Facebook’s original decision to remove the content was not in line with the company’s Community Standards. As the misplaced internal guidance specifies that users can discuss the conditions of confinement of an individual who has been designated as dangerous, the post was permitted under Facebook’s rules.

The Board is concerned that Facebook lost specific guidance on an important policy exception for three years. Facebook’s policy of defaulting towards removing content showing “support” for designated individuals, while keeping key exceptions hidden from the public, allowed this mistake to go unnoticed for an extended period. Facebook only learned that this policy was not being applied because of the user who decided to appeal the company’s decision to the Board.

While Facebook told the Board that it is conducting a review of how it failed to transfer this guidance to its new review system, it also stated “it is not technically feasible to determine how many pieces of content were removed when this policy guidance was not available to reviewers.” The Board believes that Facebook’s mistake may have led to many other posts being wrongly removed and that Facebook’s transparency reporting is not sufficient to assess whether this type of error reflects a systemic problem. Facebook’s actions in this case indicate that the company is failing to respect the right to remedy, contravening its Corporate Human Rights Policy (Section 3).

Even without the discovery of the misplaced guidance, the content should never have been removed. The user did not advocate violence in their post and did not express support for Öcalan’s ideology or the PKK. Instead, they sought to highlight human rights concerns, also raised by international bodies, about Öcalan’s prolonged solitary confinement. As the post was unlikely to result in harm, its removal was not necessary or proportionate under international human rights standards.

The Oversight Board’s decision

The Oversight Board overturns Facebook’s original decision to remove the content. The Board notes that Facebook has already restored the content.

In a policy advisory statement, the Board recommends that Facebook:

  • Immediately restore the misplaced 2017 guidance to the Internal Implementation Standards and Known Questions (the internal guidance for content moderators).
  • Evaluate automated moderation processes for enforcement of the Dangerous Individuals and Organizations policy. Where necessary, Facebook should update classifiers to exclude training data from prior enforcement errors that resulted from failures to apply the 2017 guidance.
  • Publish the results of the ongoing review process to determine if any other policies were lost, including descriptions of all lost policies, the period they were lost for, and steps taken to restore them.
  • Ensure the Dangerous Individuals and Organizations “policy rationale” reflects that respect for human rights and freedom of expression can advance the value of “Safety.” The policy rationale should specify in greater detail the “real-world harms” the policy seeks to prevent and disrupt when “Voice” is suppressed.
  • Add to the policy a clear explanation of what “support” excludes. Users should be free to discuss alleged abuses of the human rights of members of designated organizations.
  • Explain in the Community Standards how users can make the intent behind their posts clear to Facebook.
  • Ensure meaningful stakeholder engagement on the proposed changes to its Dangerous Individuals and Organizations policy through Facebook’s Product Policy Forum, including through a public call for inputs.
  • Ensure internal guidance and training are provided to content moderators on any proposed policy changes.
  • Ensure that users are notified when their content is removed. The notification should note whether the removal is due to a government request, a violation of the Community Standards, or a government’s claim that a national law has been violated (and the jurisdictional reach of any removal).
  • Clarify to Instagram users that Facebook’s Community Standards apply to Instagram in the same way they apply to Facebook.
  • Include information in its transparency reporting on the number of requests received for content removals from governments based on Community Standards violations (as opposed to violations of national law), and the outcomes of those requests.
  • Include more comprehensive information in its transparency reporting on error rates for enforcing rules on “praise” and “support” of dangerous individuals and organizations, broken down by region and language.

For further information:

To read the full case decision, click here.

To read a synopsis of public comments for this case, click here.
