Upheld
COVID lockdowns in Brazil
August 19, 2021
The Oversight Board has upheld Facebook's decision to leave up a post by a state-level medical council in Brazil, which claimed that lockdowns are ineffective and have been condemned by the World Health Organization (WHO).
Case summary
The Oversight Board has upheld Facebook’s decision to leave up a post by a state-level medical council in Brazil, which claimed that lockdowns are ineffective and have been condemned by the World Health Organization (WHO).
The Board found that Facebook’s decision to keep the content on the platform was consistent with its content policies. The content contained some inaccurate information, which raises concerns given the severity of the pandemic in Brazil and the council’s status as a public institution. However, the Board found that the content did not create a risk of imminent harm and should, therefore, stay on the platform. Finally, the Board emphasized that, in circumstances such as those in this case, Facebook should adopt measures other than removal to counter the spread of COVID-19 misinformation.
About the case
In March 2021, the Facebook page of a state-level medical council in Brazil posted a picture of a written notice on measures to reduce the spread of COVID-19, entitled “Public note against lockdown.”
The notice claims that lockdowns are ineffective, contrary to fundamental rights in the Constitution, and condemned by the WHO. It includes an alleged quote from Dr. David Nabarro, a WHO special envoy for COVID-19, stating that "the lockdown does not save lives and makes poor people much poorer." The notice cites an increase in deaths and hospital admissions in the Brazilian state of Amazonas after lockdown as evidence of the failure of lockdown restrictions. It also claims that lockdowns would lead to greater mental disorders, alcohol and drug abuse, and economic damage, amongst other things. It concludes that effective preventative measures against COVID-19 include education campaigns about hygiene, masks, social distancing, vaccination and government monitoring – but never lockdowns.
The page has over 10,000 followers. The content was viewed around 32,000 times and shared around 270 times. No users reported the content. Facebook took no action against the content and referred the case to the Board. The content remains on the platform.
Key findings
The Board concluded that Facebook’s decision to keep the content on the platform was consistent with its content policies. The Violence and Incitement Community Standard prohibits content which contains misinformation that contributes to the risk of imminent violence or physical harm. The Help Center article linked from the Standard states that Facebook determines whether information is false based on the opinion of public health authorities. The Board found that the content contained some inaccurate information, which raises concerns given the severity of the pandemic in Brazil and the council’s status as a public institution. However, the Board found that the content did not create a risk of imminent harm.
The statement that the WHO condemned lockdowns and the quote attributed to Dr. David Nabarro are not fully accurate. Dr. Nabarro did not say that “lockdown does not save lives,” but instead noted that the WHO did “not advocate lockdowns as a primary means of control of this virus” and that they have the consequence of “making poor people an awful lot poorer.” The WHO has said that “lockdowns are not sustainable solutions because of their significant economic, social and broader health impacts. However, during the #COVID19 pandemic there’ve been times when restrictions were necessary and there may be other times in the future.”
The Board notes Facebook’s argument that the threshold of “imminent harm” was not met because the WHO and other health experts advised the company to “remove claims advocating against specific health practices, such as social distancing,” but not claims advocating against lockdowns. Despite confirming that it has been in communication with Brazil’s national public health authority, Facebook said it does not take into account local context when defining the threshold of imminent harm for enforcement of the policy on misinformation and harm.
The Board believes that Facebook should take into consideration local context when assessing the risk of imminent physical harm and the fact that the content was shared by a public institution, which has a duty to provide reliable information. However, the Board still finds that the post does not meet the threshold of imminent harm in this case, despite the severity of the pandemic in Brazil, because the post emphasized the importance of other measures to counter the spread of COVID-19 – including social distancing.
Facebook disclosed that the post was eligible for fact-checking, but that fact-checking partners did not assess this content. The Board notes that Facebook’s approach failed to provide additional context to content that may endanger people’s trust in public information about COVID-19, and that Facebook should prioritize sending potential health misinformation from public authorities to fact-checking partners.
The Board notes that Facebook has previously stated that content from politicians is not eligible for fact-checking, but its policies do not make clear the eligibility criteria for other users, such as pages or accounts administered by public institutions.
The Oversight Board’s decision
The Oversight Board upholds Facebook's decision to keep the content on the platform.
In a policy advisory statement, the Board recommends that Facebook:
- Implement the Board’s recommendation from case decision 2020-006-FB-FBR for Facebook to adopt less intrusive measures where content related to COVID-19 distorts the advice of international health authorities and where a potential for physical harm is identified but is not imminent.
- Prioritize the fact-checking of content flagged as health misinformation, taking into consideration the local context.
- Provide more transparency within the False News Community Standard regarding when content is eligible for fact-checking, including whether public institutions' accounts are subject to fact-checking.
*Case summaries provide an overview of the case and do not have precedential value.
Full case decision
1. Decision summary
The Oversight Board has upheld Facebook’s decision to leave up a post by a state-level medical council in Brazil, which claimed that lockdowns are ineffective and have been condemned by the World Health Organization (WHO). As such, the content will remain on Facebook.
2. Case description
In March 2021, the Facebook page of a state-level medical council in Brazil posted a picture of a written notice with messaging in Portuguese on measures to reduce the spread of COVID-19, entitled “Public note against lockdown.” The notice claims that lockdowns are ineffective, contrary to the fundamental rights in the Constitution, and condemned by the World Health Organization (WHO). It includes an alleged quote from Dr. David Nabarro, one of the WHO’s special envoys for COVID-19, stating that "the lockdown does not save lives and makes poor people much poorer." The notice also cites an increase in the number of deaths and hospital admissions in the Brazilian state of Amazonas after lockdown as evidence of the failure of lockdown restrictions. It further claims that lockdowns would lead to an increase in mental disorders, alcohol and drug abuse, and economic damage, amongst other things. It concludes that effective preventative measures against COVID-19 include education campaigns about hygiene measures, the use of masks, social distancing, vaccination and extensive monitoring by the government – but never the decision to adopt lockdowns.
The page has more than 10,000 followers. The content was viewed around 32,000 times and shared around 270 times. No users reported the content. Facebook took no action against the content and referred the case to the Board. The content remains on the platform.
The following factual background is relevant to the Board’s decision. Article 1 of Brazil’s Federal Law No. 3268/1957 outlines that medical councils are part of the government administration of each of the 26 states, endowed with legal personality under public law as well as having administrative and financial autonomy. The councils are responsible for the professional registration of medical doctors and their titles. Article 2 notes that they are supervisory bodies of professional ethics and have sanctioning powers over physicians. Medical councils do not have authority to impose measures such as lockdowns under Federal Law No. 3268/1957.
The claims made in the post that the WHO condemned lockdowns and that Dr. David Nabarro said that “lockdown does not save lives” are not fully accurate. Dr. Nabarro noted that lockdowns have the consequence of “making poor people an awful lot poorer,” but he did not say that they “do not save lives.” The WHO has not condemned lockdowns; rather, it has said that lockdowns are not a sustainable solution due to their significant economic, social and broader health impacts, that there may be times when such restrictions are necessary, and that they are best used to prepare for longer-term public health measures.
The lockdown in Amazonas referred to in the notice shared by the medical council was adopted between January 25 and January 31, 2021, by Decree No. 43,303 of January 23, 2021, and extended by Decree No. 43,348 of January 31, 2021, until February 7, 2021. The Decrees established temporary restrictions on the movement of people in public venues and suspended the operation of all commercial activities and services with a few exceptions – including the transportation of essential goods, the operation of markets, bakeries, drug stores, gas stations, banks and health care units, among others. The lockdown measures were enforced by the police and other authorities. Those not abiding by the Decrees could face a number of sanctions.
3. Authority and scope
The Oversight Board has the power to review a broad set of questions referred by Facebook (Charter Article 2, Section 1; Bylaws Article 2, Section 2.1). Decisions on these questions are binding and may include policy advisory statements with recommendations. These recommendations are non-binding but Facebook must respond to them (Charter Article 3, Section 4).
4. Relevant standards
The Oversight Board considered the following standards in its decision:
I. Facebook’s Community Standards:
The introduction to the Community Standards contains a section titled “COVID-19: Community Standards Updates and Protections.” The full text states:
As people around the world confront this unprecedented public health emergency, we want to make sure that our Community Standards protect people from harmful content and new types of abuse related to COVID-19. We're working to remove content that has the potential to contribute to real-world harm, including through our policies prohibiting the coordination of harm, the sale of medical masks and related goods, hate speech, bullying and harassment, and misinformation that contributes to the risk of imminent violence or physical harm.
As the situation evolves, we are continuing to look at content on the platform, assess speech trends and engage with experts, and will provide additional policy guidance when appropriate to keep the members of our community safe during this crisis. [emphasis added]
The Violence and Incitement Community Standard states that Facebook prohibits content containing "Misinformation and unverifiable rumors that contribute to the risk of imminent violence or physical harm.” It then states: “Additionally, we have specific rules and guidance regarding content related to COVID-19 and vaccines. To see these specific rules, please click here."
According to the article provided in the link above, under this policy Facebook removes content discouraging good health practices that “public health authorities advise people take to protect themselves from getting or spreading COVID-19,” including “wearing a face mask, social distancing, getting tested for COVID-19 and […] getting vaccinated against COVID-19.”
The policy rationale for Facebook’s False News Community Standard states that:
Reducing the spread of false news on Facebook is a responsibility that we take seriously. We also recognize that this is a challenging and sensitive issue. We want to help people stay informed without stifling productive public discourse. There is also a fine line between false news and satire or opinion. For these reasons, we don't remove false news from Facebook, but instead significantly reduce its distribution by showing it lower in the News Feed.
The False News Standard provides information on the range of enforcement options used by Facebook besides content removal:
We are working to build a more informed community and reduce the spread of false news in a number of different ways, namely by:
- Disrupting economic incentives for people, Pages, and domains that propagate misinformation.
- Using various signals, including feedback from our community, to inform a machine learning model that predicts which stories may be false.
- Reducing the distribution of content rated false by independent fact-checkers.
- Empowering people to decide for themselves what to read, trust or share by informing them with more context and promoting news literacy.
- Collaborating with academics and other organizations to help solve this challenging issue.
II. Facebook’s values:
Facebook’s values are described in the introduction to the Community Standards. “Voice” is described as Facebook’s paramount value:
The goal of our Community Standards has always been to create a place for expression and give people a voice. This has not and will not change. Building community and bringing the world closer together depends on people’s ability to share diverse views, experiences, ideas and information. We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.
Facebook notes that “Voice” may be limited in service of four other values – the relevant one in this case is “Safety”:
We are committed to making Facebook a safe place. Expression that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on Facebook.
III. Human rights standards:
The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. In March 2021, Facebook announced its Corporate Human Rights Policy, where it recommitted to respecting human rights in accordance with the UNGPs. The Board's analysis in this case was informed by the following human rights standards:
- The right to freedom of opinion and expression: Article 19, International Covenant on Civil and Political Rights (ICCPR); General Comment No. 34, Human Rights Committee, 2011; UN Special Rapporteur on freedom of opinion and expression, reports: A/HRC/38/35 (2018), A/74/486 (2019), A/HRC/44/49 (2020), A/HRC/47/25 (2021); Joint Declaration on Freedom of Expression and “Fake News”, Disinformation and Propaganda, FOM.GAL/3/17 (2017).
- The right to health: Article 12, International Covenant on Economic, Social and Cultural Rights (ICESCR); General Comment No. 14, the Committee on Economic, Social and Cultural Rights (2000).
- The right to life: Article 6, ICCPR.
5. User statement
Facebook referred this case to the Oversight Board. Facebook confirmed that it notified the user of the referral and provided the user with the opportunity to submit information on this case, but the user did not submit a statement.
The Board notes that the notification sent by Facebook gives the user the opportunity to submit information. The Board is concerned, however, that Facebook does not give the user sufficient information to properly prepare a statement. The notification shown to the user states the general topics the case relates to, but does not explain in detail why the content was referred to the Board or identify the relevant policies against which the content might be enforced.
6. Explanation of Facebook’s decision
Facebook took no action against the content and stated in its referral to the Board that the case is “difficult because this content does not violate Facebook’s policies, but can still be read by some people as advocacy against taking certain safety measures during the pandemic.” It explained that “an internal team at Facebook familiar with the region noted reports from the press about the case content and flagged the case for review. The reviewers determined that the content did not violate Facebook’s policies.”
Facebook says that it prohibits misinformation that may “contribute to the risk of imminent violence or physical harm,” and that it consults with the WHO, the U.S. Centers for Disease Control and Prevention, and other leading public health authorities in order to determine whether a particular false claim about COVID-19 may contribute to the risk of imminent physical harm. Facebook says that the content in this case does not meet that standard. It says that “the WHO does not state that criticizing lockdown measures may contribute to the risk of imminent physical harm” and that "while the World Health Organization and other health experts have advised Facebook to remove claims advocating against specific health practices, such as social distancing, they have not advised Facebook to remove claims advocating against lockdowns."
In response to a question from the Board on how Facebook defines the line between lockdowns and social distancing measures, Facebook stated that “the WHO defines ‘lockdowns’ as large-scale physical distancing measures and movement restrictions put in place by the government. Social distancing, on the other hand, is the practice of an individual keeping a certain amount of physical distance from another person. A lockdown can, in theory, include social distancing as a requirement.”
Facebook also noted that “in this case, the post was eligible to be rated by our third party fact-checkers, but the fact checkers did not rate this content. [sic] and it was not downranked or labeled as false news.” Facebook stated that its fact-checking partners are independent and it “does not speculate on why they rate or do not rate eligible posts, including this one.”
Facebook says that it does not take a different approach to the threshold for health misinformation depending on the context in different countries – its policies are global in scope. It states that it consults with leading public health authorities in developing its policies, and confirmed in its responses to the Board’s questions that it has been in communication with the national public health authority in Brazil.
7. Third-party submissions
The Oversight Board received 30 public comments on this case. Three comments were submitted from Asia Pacific and Oceania, one from Central and South Asia, nine from Latin America and the Caribbean, and 17 from the US and Canada.
A range of organizations and individuals submitted comments, including a number of researchers and organizations in Brazil. The submissions covered the following themes: the importance of considering the Brazilian context, including the impact of COVID-19 and the political context; discussion and analysis of the impact of alternative enforcement measures such as labeling and downranking; and the influential nature of the user as a medical authority.
Comments providing more context on the situation in Brazil noted the politicization of the health emergency in Brazil (PC-10105), that adherence to evidence-based public policy measures combatting COVID-19 had been affected by political forces in Brazil contesting such measures (PC-10100) and that due to a context in which “lockdown” had become a political buzzword, claims advocating against lockdowns could also encourage defiance of other safety measures (PC-10106). Researchers focused on disinformation in Brazil also found that public authorities have a much higher impact when sharing disinformation (PC-10104).
To read public comments submitted for this case, please click here.
8. Oversight Board analysis
8.1 Compliance with Community Standards
The Board concludes that Facebook’s decision to keep the content on the platform was consistent with its content policies. The Violence and Incitement Community Standard prohibits content which contains misinformation that contributes to the risk of imminent violence or physical harm. The Help Center article linked from the Violence and Incitement Community Standard states that Facebook removes false content under this policy based on previous guidance from public health authorities. Although the Board finds that the content contained some misinformation (see below), the content did not create a risk of imminent harm.
The post claims that lockdowns are ineffective and condemned by the WHO, and includes an alleged quote from WHO official Dr. David Nabarro saying that "the lockdown does not save lives and makes poor people much poorer." This information is not fully accurate. The part of the quote stating that “lockdown does not save lives” is inaccurate – Dr. Nabarro stated that the WHO did “not advocate lockdowns as a primary means of control of this virus” and that they have the consequence of “making poor people an awful lot poorer,” but he did not say that “lockdown does not save lives.” The WHO has said that “lockdowns are not sustainable solutions because of their significant economic, social and broader health impacts. However, during the #COVID19 pandemic there’ve been times when restrictions were necessary and there may be other times in the future. ... Because of their severe economic, social and broader health impacts, lockdowns need to be limited in duration. They’re best used to prepare for longer-term public health measures. During these periods, countries are encouraged to lay the groundwork for more sustainable solutions.”
The Board notes Facebook’s argument that the threshold of “imminent harm” was not met because the World Health Organization and “other health experts” advised the company to “remove claims advocating against specific health practices, such as social distancing,” but not claims advocating against lockdowns. Despite confirming that it has been in communication with “the national public health authority in Brazil,” Facebook highlighted that it does not take into account local context when defining the threshold of “imminent harm” for the enforcement of the policy on misinformation and harm.
The Board believes, however, that Facebook should take into consideration local context and consider the current situation in Brazil when assessing the risk of imminent physical harm. As highlighted by the experts consulted by the Board, as well as several public comments submitted by organizations and researchers in Brazil, the COVID-19 pandemic has already resulted in more than 500,000 deaths in the country, one of the worst rates of deaths per million inhabitants of any country. The experts consulted and some public comments also emphasized the politicization of measures to counter the spread of COVID-19 in the country.
In light of the situation and context in Brazil, the Board is concerned that the spread of COVID-19 misinformation in the country can endanger people’s trust in public information about appropriate measures to counter the pandemic, which could make users more likely to adopt risky behaviors. The Board understands that this would justify a more nuanced approach by Facebook in the country, intensifying its efforts to counter misinformation there, as the Board advocates under Recommendation 2 below. However, the Board still finds that the post does not meet the threshold of imminent harm, because it discusses a measure that public health authorities do not recommend unconditionally and emphasizes the importance of other measures to counter the spread of COVID-19 – including social distancing.
In its responses to questions from the Board in this case, Facebook disclosed that the post was eligible for fact-checking under the False News Community Standard, but that fact-checking partners did not assess this content. The Board understands that these partners may not be able to analyze all content flagged as misinformation by Facebook’s automated systems, internal teams or users. However, the Board notes that Facebook’s approach to misinformation failed to provide additional context to a piece of content that may endanger people’s trust in public information about COVID-19 and may undermine the effectiveness of measures that in certain cases can be essential. Facebook should prioritize sending to its fact-checking partners content that comes to its attention and appears to contain health misinformation shared by public authorities, especially during the pandemic. The Board has issued a recommendation in this regard in section 10. The Board also notes that Facebook has previously stated that “opinion and speech” from politicians are not eligible for fact-checking, but its policies do not make clear the eligibility criteria for other users, such as pages or accounts administered by state and public institutions. In the Board’s view, content shared by state and public institutions should be eligible for fact-checking.
8.2 Compliance with Facebook’s values
The Board found that Facebook’s decision to take no action against this content was consistent with its value of “Voice.” Although Facebook’s value of “Safety” is important, particularly in the context of the pandemic, this content did not pose a sufficiently imminent danger to “Safety” to justify displacing “Voice.”
8.3 Compliance with Facebook’s human rights responsibilities
Freedom of expression (Article 19 ICCPR)
Article 19 para. 2 of the ICCPR provides broad protection for expression of "all kinds." The UN Human Rights Committee has highlighted that the value of expression is particularly high when it involves public institutions or discusses matters of public concern (General comment No. 34, paras. 13, 20 and 38). As an institution established by law, the medical council is a public institution which has human rights duties, including the duty to ensure that it disseminates reliable and trustworthy information about matters of public interest (A/HRC/44/49, para. 44).
The Board notes that even though the medical councils do not have authority to impose measures such as lockdowns, it is relevant that they are part of the state government administration and may exert influence over the authorities deciding on the adoption of measures to counter the spread of COVID-19.
The Board notes that the post engages with a wider and important discussion in Brazil about appropriate measures to counter the spread of COVID-19 in the country. Moreover, because the post was shared by the Facebook page of a medical council in Brazil, there is heightened public interest in its views as an institution on public health issues. The Board recognizes the importance of professional experts being able to state their views on matters of public health policy.
The right to freedom of expression is fundamental and includes the right to receive information, including from governmental entities – however, this right is not absolute. Where restrictions are imposed by a state, they must meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR). Facebook has recognized its responsibilities to respect international human rights standards under the UNGPs. Relying on the UNGPs framework, the UN Special Rapporteur on freedom of opinion and expression has called on social media companies to ensure their content rules are guided by the requirements of Article 19, para. 3, ICCPR (on content rules addressing disinformation, see: A/HRC/47/25, para. 96; on content rules more broadly, see: A/HRC/38/35, paras. 45 and 70). The Board examined whether the removal of the post would be justified under this three-part test in accordance with Facebook’s human rights responsibilities.
I. Legality (clarity and accessibility of the rules)
Article 19, para. 3, ICCPR requires any rules a state imposes to restrict expression to be clear, precise and publicly accessible (General Comment 34, para. 25). People should have enough information to determine if and how their access to information may be limited. To protect these rights, it is also important that public bodies are able to clearly understand the rules that apply to their communications on the platform and adjust their behavior accordingly. General Comment 34 also highlights that the rules imposed “may not confer unfettered discretion for the restriction of freedom of expression on those charged with its execution” (para. 25). Facebook also has a responsibility to ensure its rules comply with the principle of legality (A/HRC/38/35, para. 46).
In case decision 2020-006-FB-FBR, the Board found that it was “difficult for users to understand what content relating to health misinformation is prohibited” under Facebook’s Community Standards, considering the “patchwork” of relevant rules (including misinformation that contributes to a risk of imminent harm under “Violence and Incitement”). The Board also noted the lack of public definitions of key terms such as “misinformation,” concluding this made the Violence and Incitement Community Standard “inappropriately vague” as it applied to misinformation. In this regard, the UN Special Rapporteur on freedom of expression has stated that the principle of legality should be applied “to any approach” to misinformation because it is an “extraordinarily elusive concept to define in law, susceptible to providing executive authorities with excessive discretion” (A/HRC/44/49, para. 42). To address these issues, the Board recommended that Facebook “set out a clear and accessible Community Standard on health misinformation, consolidating and clarifying existing rules in one place.”
In response to the Board’s recommendation, Facebook published the Help Center article “COVID-19 and Vaccine Policy Updates and Protections,” which is linked to the misinformation and harm policy under the Violence and Incitement Community Standard. In this article, Facebook lists all relevant COVID-19 and vaccine policies from various Community Standards and provides examples of violating content. This article is also available in Portuguese.
While the Help Center article provides useful information for users to understand how the policy is enforced, it also adds to the number of sources of rules outside the Community Standards. Additionally, the article is not sufficiently “made accessible to the public” (General Comment 34, para. 25), considering it is only accessible to people with a Facebook log-in. Moreover, it is only linked from the Community Standard on Violence and Incitement, and not from other applicable Community Standards or the announcement on COVID-19 in the introduction to the Community Standards.
The Board also reiterates the point made in section 5 above that Facebook does not provide users with sufficient information to submit a statement to the Board.
II. Legitimate aim
Any restriction on freedom of expression should also pursue a "legitimate aim." Facebook has a responsibility to ensure its rules comply with the principle of legitimacy (A/HRC/38/35, para. 45). The ICCPR lists legitimate aims in Article 19, para. 3; these include the protection of the rights of others as well as the protection of public health.
III. Necessity and proportionality
Any restrictions on freedom of expression "must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected" (General Comment 34, para. 34). Facebook has a responsibility to ensure its rules respect the principles of necessity and proportionality (A/HRC/38/35, para. 47).
The Board assessed whether the content removal was necessary to protect public health and the right to health, in line with Facebook’s human rights responsibilities. The content was shared by the page of a medical council, a part of the state government administration that may, through the information it shares, influence decisions of other public authorities and the behavior of the general public.
The Board notes that it is relevant for Facebook to consider whether a page or account is administered by a public institution, as it is in this case, because those institutions should “not make, sponsor, encourage or further disseminate statements which they know or reasonably should know to be false” or which “demonstrate a reckless disregard for verifiable information” (UN Special Rapporteur on freedom of expression, report A/HRC/44/49, para. 44; Joint Declaration on Freedom of Expression and “Fake News”, Disinformation and Propaganda, FOM.GAL/3/17, para. 2(c)). Further, state actors should, “in accordance with their domestic and international legal obligations and their public duties, take care to ensure that they disseminate reliable and trustworthy information, including about matters of public interest, such as the economy, public health, security and the environment” (ibid., para. 2(d)). This duty is particularly strong when the information is related to the right to health, especially during a global pandemic.
A minority is of the view that the standard quoted from the Joint Declaration is not applicable in the present case and that the definition used in the Joint Declaration is contradicted by other authorities of international human rights law. The standard of the Joint Declaration refers to disinformation by public institutions, while in the present case the decision expressly qualifies the impugned statement as misinformation. As emphasized by the Special Rapporteur, the interchangeable use of the two concepts endangers the right to freedom of expression (A/HRC/47/25, para. 14): “disinformation is understood as false information that is disseminated intentionally to cause serious social harm and misinformation as the dissemination of false information unknowingly. The terms are not used interchangeably” (para. 15). In the present case, it has not been shown that the user, a medical council, reasonably should have known that the disseminated statement was false. The minority believes that while the statement contains some inaccurate information, as a whole it is a fact-related opinion which is legitimate in public discussion. The efficacy of lockdowns, while widely accepted among experts and public health agencies in most of the world, is subject to reasonable debate. Moreover, while the council is part of the public administration, it cannot be held in the present context to be a state actor, as its powers are limited to its members and it is not a public authority with the legal power to influence or determine a lockdown decision.
The majority understands the minority’s view but respectfully disagrees with it. According to the standards above, public authorities have a duty to verify information they provide to the public. This duty is not lost when the false information disseminated is not directly related to its statutory duties.
Facebook argued that the threshold of imminent physical harm was not reached in this case because health authorities such as the World Health Organization and other experts have recommended that the company remove misinformation on practices such as social distancing, but they have not done the same with respect to lockdowns. Additionally, the Board notes that the content in this case was not used as a basis by the council for the adoption of public health measures that could create risks, since the council does not have authority to decide on these matters. For these reasons, and following the Board’s analysis in case decision 2020-006-FB-FBR, the Board considers Facebook’s decision to keep the content on the platform to be justified, given that the threshold of imminent physical harm was not met. However, as already mentioned, the Board notes that the dissemination of misinformation on public health can affect trust in public information and the effectiveness of certain measures that, in the words of the World Health Organization, may be essential in certain contexts. In these cases, as the UN Special Rapporteur on freedom of expression suggested, the damage caused by false or misleading information can be mitigated by the sharing of reliable information (A/HRC/44/49, para. 6). Such alternative or less intrusive measures can provide the public with greater context and promote their right to access accurate health-related information. In this particular case, Facebook should provide the public with more context about the statements of Dr. Nabarro and the World Health Organization’s stance on lockdowns mentioned above.
The Board recalls that in case decision 2020-006-FB-FBR it recommended that Facebook should consider less intrusive measures than removal for misinformation that may lead to forms of physical harm that are not imminent. These measures are provided for in the False News Community Standard – as noted above in section 8.1. The Board recommends that Facebook prioritize referring to its fact-checking partners content that comes to its attention and presents a public position on debated health policy issues (in particular in the context of a pandemic) taken by a part of state government administration normally capable of influencing public opinion and individual health-related conduct. The Board recognizes that Facebook’s approach to fact-checking has been criticized, but because fact-checkers did not review this post, this case is not a proper occasion to consider those issues.
9. Oversight Board decision
The Oversight Board upholds Facebook's decision to keep the content on the platform.
10. Policy advisory statement
Implementing the Board’s recommendation from case decision 2020-006-FB-FBR
1. Facebook should conduct a proportionality analysis to identify a range of less intrusive measures than removing the content. When necessary, the least intrusive measures should be used where content related to COVID-19 distorts the advice of international health authorities and where a potential for physical harm is identified but is not imminent. Recommended measures include: (a) labeling content to alert users to the disputed nature of a post's content and to provide links to the views of the World Health Organization and national health authorities; (b) introducing friction to posts to prevent interactions or sharing; and (c) down-ranking, to reduce visibility in other users’ News Feeds. All these enforcement measures should be clearly communicated to all users, and subject to appeal.
Prioritizing the fact-checking of content flagged as health misinformation
2. Given the context of the COVID-19 pandemic, Facebook should make technical arrangements to prioritize fact-checking of potential health misinformation shared by public authorities which comes to the company’s attention, taking into consideration the local context.
Clarity on eligibility for fact-checking
3. Facebook should provide more transparency within the False News Community Standard regarding when content is eligible for fact-checking, including whether public institutions' accounts are subject to fact-checking.
*Procedural note:
The Oversight Board’s decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.
For this case decision, independent research was commissioned on behalf of the Board. An independent research institute headquartered at the University of Gothenburg and drawing on a team of over 50 social scientists on six continents, as well as more than 3,200 country experts from around the world, provided expertise on socio-political and cultural context.