Oversight Board overturns Meta’s original decision in “Brazilian general’s speech” case
The Oversight Board has overturned Meta’s original decision to leave up a Facebook video featuring a Brazilian general calling on people to “hit the streets” and “go to the National Congress and the Supreme Court.” The Board acknowledges that Meta set up several risk evaluation and mitigation measures during and after the elections. However, given the risk of its platforms being used to incite violence in electoral contexts, Meta should continuously increase its efforts to prevent, mitigate and address such adverse outcomes. The Board recommends that Meta develop a framework for evaluating its election integrity efforts, to prevent its platforms from being used to promote political violence.
About the case
Brazil’s presidential elections in October 2022 were highly polarized, with widespread and coordinated online and offline claims questioning the legitimacy of the elections. These included calls for military intervention and for the invasion of government buildings to stop the transition to a new government. The heightened risk of political violence did not subside when newly elected President Luiz Inácio Lula da Silva assumed office on January 1, 2023, as civil unrest, protests, and encampments in front of military bases were ongoing.
Two days later, on January 3, 2023, a Facebook user posted a video related to the 2022 Brazilian elections. The caption in Portuguese includes a call to “besiege” Brazil’s Congress as “the last alternative.” The video also shows part of a speech given by a prominent Brazilian general who supports the re-election of former President Jair Bolsonaro. In the video, the uniformed general calls for people to “hit the streets” and “go to the National Congress … [and the] Supreme Court.” A sequence of images follows, including one of a fire raging in the Three Powers Plaza in Brasília, which houses Brazil’s presidential offices, Congress, and Supreme Court. Text overlaying the image reads, in Portuguese, “Come to Brasília! Let’s Storm it! Let’s besiege the three powers.” Text overlaying another image reads “we demand the source code,” a slogan that protestors have used to question the reliability of Brazil’s electronic voting machines.
On the day the content was posted, a user reported it for violating Meta’s Violence and Incitement Community Standard, which prohibits calls for forcible entry into high-risk locations. In total, four users reported the content seven times between January 3 and 4. Following the first report, the content was reviewed by a content reviewer and found not to violate Meta’s policies. The user appealed the decision, but it was upheld by a second content reviewer. The next day, the other six reports were reviewed by five different moderators, all of whom found that the content did not violate Meta’s policies.
On January 8, supporters of former president Bolsonaro broke into the National Congress, Supreme Court, and presidential offices located in the “Three Powers Plaza” in Brasília, intimidating the police and destroying property. On January 9, Meta declared the January 8 rioting a “violating event” under its Dangerous Individuals and Organizations policy and said it would remove “content that supports or praises these actions.” The company also announced that it had “designated Brazil as a Temporary High-Risk Location” and had “been removing content calling for people to take up arms or forcibly invade Congress, the Presidential palace and other federal buildings.”
After the Board selected this case, Meta determined that its repeated decisions to leave the content on Facebook were in error, and removed the content on January 20, 2023.
Key findings
This case raises concerns about the effectiveness of Meta’s election integrity efforts in the context of Brazil’s 2022 General Election, and elsewhere. While challenging the integrity of elections is generally considered protected speech, in some circumstances widespread claims that attempt to undermine elections can lead to violence. In this case, the speaker’s intent, the content of the speech and its reach, and the likelihood of imminent harm, given the political context in Brazil at the time, all justified removing the post.
For a post to violate Meta’s rules on calling for forcible entry into high-risk locations, the location must be considered “high-risk,” and it must be situated in an area or vicinity that is separately designated as a “temporary high-risk location.” As the post was an unambiguous call to forcibly enter government buildings situated in the Three Powers Plaza in Brasília (“high-risk locations” situated in a “temporary high-risk location,” Brazil), Meta’s initial decisions to leave this content up during a time of heightened political violence represented a clear departure from its own rules.
The Board is deeply concerned that, despite the civil unrest in Brazil at the time the content was posted and the widespread proliferation of similar content in the weeks and months ahead of the January 8 riots, Meta’s content moderators repeatedly assessed this content as non-violating and failed to escalate it for further review. In addition, when the Board asked Meta for information on specific election-related claims on its platforms before, during, and after the Brazilian elections, the company explained that it does not have data on the prevalence of such claims. The content in this case was finally removed more than two weeks after it was posted, by which point the violating event it called for had already occurred, and only after the Board brought the case to Meta’s attention.
In response to a question from the Board, Meta stated that it does not adopt any particular metrics for measuring the success of its election integrity efforts generally, beyond reporting data on content takedowns and on views and click-throughs of election labels. The Board therefore finds that Meta should develop a framework for evaluating the company’s election integrity efforts, and for public reporting on the subject. This would provide the company with relevant data to improve its content moderation system as a whole and to decide how best to employ its resources in electoral contexts. Without this kind of information, neither the Board nor the public can evaluate the effectiveness of Meta’s election integrity efforts more broadly.
The Oversight Board’s decision
The Oversight Board overturns Meta’s original decision to leave up the post.
The Board also recommends that Meta:
- Develop a framework for evaluating its election integrity efforts. This includes creating and sharing metrics for successful election integrity efforts, including those related to Meta’s enforcement of its content policies and its approach to ads.
- Clarify in its Transparency Center that, in addition to the Crisis Policy Protocol, the company runs other protocols in its attempt to prevent and address potential risk of harm arising in electoral contexts or other high-risk events.
For further information
To read the full decision, click here.
To read a synopsis of public comments for this case, please click the attachment below.