Overturned
Lebanese activist
September 13, 2023
This is a summary decision. Summary decisions examine cases where Meta reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors. They are approved by a Board Member panel, not the full Board. They do not consider public comments, and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas of potential improvement in its policy enforcement.
Case summary
A user appealed Meta’s decision to remove an Instagram post of an interview in which an activist discusses Hassan Nasrallah, the Secretary General of Hezbollah. This case highlights over-enforcement of Meta’s Dangerous Organizations and Individuals policy, which can negatively affect users’ ability to share political commentary and news reporting and thereby infringe their freedom of expression. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.
Case description and background
In January 2023, the verified account of a Lebanese activist posted a video of himself being interviewed in Arabic by a news anchor. The news anchor begins by jokingly asking the activist whether a professional soccer player or Hassan Nasrallah, the Secretary General of Hezbollah, is more useful. The activist responds by praising the soccer player and criticizing Nasrallah. The activist highlights the plane hijackings and kidnappings conducted by Hezbollah, along with Nasrallah’s support for Lebanese politicians Nabih Berri and Michel Aoun, both of whom the activist claims were unwanted by the Lebanese people. Throughout the interview, muted video clips of Nasrallah play. The caption the activist added continues the comparison, joking: “Let’s see how many goals Nasrallah can score first.” The post received 137,414 views and was reported 11 times.
Meta initially removed the post from Instagram under its Dangerous Organizations and Individuals policy. In his appeal to the Board, the user claimed that Hezbollah uses coordinated reporting to remove content that criticizes the organization and that “Instagram’s community guidelines are being used to extend Hezbollah’s oppression against peaceful citizens like me.” The Board has not independently verified that coordinated reporting was responsible for the removal of this content or for any of the reports relating to it.
After the Board brought this case to Meta’s attention, the company determined that the removal was incorrect and restored the content to Instagram. The company acknowledged that while Hassan Nasrallah is a designated dangerous individual, its policy allows users to criticize or neutrally report on the actions of a dangerous organization or individual. Specifically, Meta permits “[an] expression of a negative perspective about a designated entity or individual,” including “disapproval, disgust, rejection, criticism, mockery etc.” Meta acknowledged that the video was posted with a satirical and condemning caption, making the content non-violating.
Board authority and scope
The Board has authority to review Meta’s decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
Where Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, to reduce errors, and to increase fairness for Facebook and Instagram users.
Case significance
This case highlights over-enforcement of the prohibition on “praise” in Meta’s Dangerous Organizations and Individuals policy, which can negatively affect users’ ability to share political commentary and news reporting on Meta’s platforms.
The Board has issued recommendations relating to the Dangerous Organizations and Individuals policy’s prohibition on praise of designated entities. These include a recommendation to create a “reporting” allowance that would permit positive statements about dangerous organizations and individuals in news reporting, which Meta committed to implement (Mention of the Taliban in news reporting, recommendation no. 4), and a recommendation to assess the accuracy of the “reporting” allowance to identify systemic issues causing enforcement errors, which Meta is still assessing (Mention of the Taliban in news reporting, recommendation no. 5).
Furthermore, the Board has issued recommendations on clarifying the policy for users. These include a recommendation to add criteria and illustrative examples to Meta’s Dangerous Organizations and Individuals policy to increase understanding of its exceptions, specifically around neutral discussion and news reporting, a recommendation Meta is still assessing (Shared Al Jazeera post, recommendation no. 1).
Decision
The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to the company’s attention.