Overturned
Federal Constituency in Nigeria
A user appealed Meta’s decision to remove a Facebook post containing an image of Nigerian politician Yusuf Gagdi with a caption referring to a federal constituency in Nigeria.
This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors. They are approved by a Board Member panel, not the full Board. They do not involve a public comment process and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.
Case Summary
A user appealed Meta’s decision to remove a Facebook post containing an image of Nigerian politician Yusuf Gagdi with a caption referring to a federal constituency in Nigeria. The removal was apparently based on the fact that the Nigerian constituency goes by the same initials (PKK) used to designate a terrorist organization active in Turkey, though the two entities are entirely unrelated. This case highlights Meta’s overenforcement of its Dangerous Organizations and Individuals policy, which can undermine users’ ability to make and share political commentary, infringing their freedom of expression. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.
Case Description and Background
In July 2023, a Facebook user posted a photograph of Nigerian politician Yusuf Gagdi with the caption “Rt Hon Yusuf Gagdi OON member of the house of reps PKK.” Mr. Gagdi represents the Pankshin/Kanam/Kanke Federal Constituency in Plateau State in the Nigerian Federal House of Representatives. The constituency encompasses three areas, which the user refers to by abbreviating their full names to PKK. However, PKK is also an alias of the Kurdistan Workers’ Party, a designated dangerous organization.
Meta initially removed the post from Facebook, citing its Dangerous Organizations and Individuals policy, under which the company removes content that “praises,” “substantively supports” or “represents” individuals and organizations it designates as dangerous.
In their appeal to the Board, the user stated that the post shows a democratically elected representative of a Nigerian federal constituency presenting a motion in the House, and does not violate Meta’s Community Standards.
After the Board brought this case to Meta’s attention, the company determined that the post’s removal was incorrect because it does not contain any reference to a designated organization or individual, and it restored the content.
Board Authority and Scope
The Board has authority to review Meta's decision following an appeal from the person whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation processes involved, reduce errors and increase fairness for Facebook and Instagram users.
Case Significance
This case highlights an error in Meta’s enforcement of its Dangerous Organizations and Individuals policy. Such errors can infringe users’ freedom of expression.
The Board has issued several recommendations about the Dangerous Organizations and Individuals policy. These include a recommendation to “evaluate automated moderation processes for enforcement of the Dangerous Individuals and Organizations policy,” which Meta declined to implement (Öcalan’s Isolation decision, recommendation no. 2). The Board has also recommended that Meta “implement an internal audit procedure to continuously analyze a statistically representative sample of automated content removal decisions to reverse and learn from enforcement mistakes” (Breast Cancer Symptoms and Nudity decision, recommendation no. 5). Meta described this recommendation as work it already does but did not publish information to demonstrate implementation.
Decision
The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to the company’s attention. The Board also urges Meta to speed up the implementation of still-open recommendations to reduce such errors.