A person scrutinizing a sphere she’s holding in her hand, while shapes and clouds float around her.

Announcing the Board's next cases and changes to our Bylaws


November 2021

Today, the Board is announcing three new cases for consideration, as well as changes to our Bylaws.

Case selection

As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse, or raise important questions about the policies of Facebook, now called Meta.

The cases we are announcing today are:

2021-015-FB-UA

User appeal to restore content to Facebook

Submit public comment here.

In June 2021, a Facebook user in the United States posted in a private group for adults with attention deficit hyperactivity disorder (ADHD). The post consists of text in English and opens with “CW” (content warning) for “Medication, addiction.” The user identifies themselves as someone with ADHD and asks the group how to approach speaking with a doctor about specific medication. The user states that the medication Adderall has worked for them in the past, while other medications “zombie me out,” but they are concerned about being perceived as exhibiting drug-seeking behavior if they directly ask their doctor for a prescription. The post received comments from group members describing their own experiences and offering advice on how to explain the situation to a doctor. The group administrators are based in Canada and New Zealand.

Meta removed the content under Facebook's Regulated Goods Community Standard. As a result of the Board selecting this case, Meta identified its removal as an “enforcement error” and restored the content. At the time of removal, the content had been viewed over 700 times, and it had not been shared. No users reported the content.

Under Facebook's Regulated Goods policy, Meta takes down content that “attempts to buy, sell or trade pharmaceutical drugs…[or] asks for pharmaceutical drugs except when content discusses the affordability, accessibility or efficacy of pharmaceutical drugs in a medical context.”

In their appeal to the Board, the user stated that they are a patient seeking advice on how best to discuss an important issue with their doctor. The user called attention to the importance of open and mature conversations about health care and mental health.

The Board would appreciate public comments that address:

  • Whether Meta’s initial decision to remove the post was consistent with Facebook’s Regulated Goods Community Standard and stated values.
  • Whether Meta’s initial decision to remove the post was consistent with the company’s human rights responsibilities and commitments.
  • How Meta’s content moderation, including the use of automation, impacts freedom of expression and access to information on health care, and how negative impacts may be prevented or mitigated.
  • The impact of Facebook’s content policies and their enforcement on discussions about pharmaceutical drugs, as well as users’ ability to share and discuss mental health issues, including the potential stigmatization of mental health issues.
  • The role and impact of social media in the abuse of prescription drugs.

In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 30 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

2021-016-FB-FBR

Case referred by Meta

Submit public comment here.

Note: Please be aware before reading that the following case summary includes potentially sensitive material relating to content about sexual violence against minors. Certain details from the content under review are abstracted in this summary to protect the interests of child victims.

On September 20, 2021, Meta referred a case to the Board concerning a Swedish journalist reporting on sexual violence against minors. The content was posted to the journalist’s “verified” Facebook page in Swedish in mid-2019.

The content contains details about the rapes of two unnamed minors, specifying their ages and the municipality in which the first crimes occurred. The post also details the convictions that two unnamed perpetrators received for those crimes. One of those perpetrators reportedly received a non-custodial sentence because they were a minor at the time they committed the offence. The perpetrator in the other case was reported as having recently completed a custodial sentence for a violent crime against another woman. The post argues that the Swedish criminal justice system is too lenient and incentivizes crime, and it advocates for the establishment of a sex offenders register in the country.

The content provides extensive and graphic details of the harmful impact of the crime on the first victim, including describing their physical and mental injuries, offline and online harassment they encountered, as well as the psychological support they received. The post also provides graphic quotes attributed to the perpetrator reportedly bragging to friends about the rape and referring to the minor in sexually explicit terms.

The post was viewed more than 14,000 times, receiving more than 1,800 comments and more than 10,000 reactions. One user reported the content in September 2019 for bullying and harassment, leading to an automated review that assessed the content as non-violating and left it up. In August 2021, Facebook’s automated systems detected the post as potentially violating, and a content reviewer assessed it as violating the Community Standards and removed it. The content was therefore on the platform for approximately two years.

Meta removed the content for violating Facebook's policy on Child Sexual Exploitation, Abuse and Nudity. Under this policy, Meta removes content that, among other things, “shows children in a sexualized context.” Meta explained in its referral to the Board that the post was in violation of this policy because it “describes how the attacker viewed the minor in sexually explicit terms.”

In its referral, Meta stated that the content decision is difficult because it highlights the tension and challenges the company “confronts when balancing the values of safety, dignity, and voice.” Meta also noted that the case is significant because “the user is a well-known investigative journalist, and he posted about a crime of public interest.” Meta further indicated that while it is important that users can “raise awareness of crimes, atrocities, and violations of human rights on Facebook,” it is also important that Facebook does “not serve as a platform for re-traumatizing victims of these crimes or facilitating their harassment.”

The Board has not received a statement from the user as of the publication of this summary.

The Board would appreciate public comments that address:

  • Whether Meta's decision to remove the post is consistent with Facebook's Child Sexual Exploitation, Abuse and Nudity Community Standard, and Facebook's stated values and human rights responsibilities and commitments.
  • Whether Facebook’s policies and their enforcement adequately protect the identities and rights of child victims of sexual crimes, including protecting against retraumatizing those victims, while also enabling public interest commentary about such crimes and the criminal justice system.
  • Whether Meta’s design choices incentivize sensationalist reporting on issues impacting children’s rights, if or how Meta should respond to such impacts, and the relevance of ethical journalism standards in this regard.

In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 30 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

2021-017-FB-UA

User appeal to restore content to Facebook

Submit public comment here.

In August 2021, a user in Afghanistan who claims to be a journalist posted in Dari on their Facebook timeline about the Taliban takeover of the country, the group’s new responsibility to govern, and the challenges ahead.

The user begins by stating that the Taliban’s goal was “sacred and virtuous, at least to their way of thinking,” that they fought to “liberate Afghanistan from what they called occupation and colonization,” and that they have achieved their goal.

The user goes on to say that the Taliban now have the responsibility of ruling the country and discusses the challenges of such a transition, in particular one that runs counter to global demands for democracy. The user says that “capitalists, businessmen, and other financial resources have fled the country,” leaving a broken economy and a society running out of resources. If international aid stops, and Afghanistan’s banking system is not restored, the user states, then “chaos and civil war will ensue.” They go on to say that Afghanistan is in “the calm before the storm” and that a society that has no food has nothing to lose and will “strive to overthrow the system.”

The user then notes that, while the security situation in Afghanistan “may seem adequate today,” the Taliban must establish an administrative framework as quickly as possible, clarify the role of banks and develop an economic strategy. The user also notes that, if the Taliban can ensure security and improve the economy, they will be far more successful than previous governments. They conclude by saying that a society’s main demands are food and security, and that their absence leads to chaos.

The post was viewed over 600 times, received over 20 reactions, and was commented on fewer than five times. According to Meta, the user’s account is located in Kabul, Afghanistan. The post was automatically reported. Meta then removed the post under Facebook's Dangerous Individuals and Organizations Community Standard. As a result of the Board selecting this case, Meta later identified the post’s removal as an “enforcement error” and restored it to the platform.

Under Tier 1 of the Dangerous Individuals and Organizations Community Standard, Facebook prohibits content that praises, substantively supports, or represents terrorist organizations that engage in serious offline harms, including organizing or advocating for violence against civilians. The public-facing Standard was updated on August 26, 2021, to specify that it covers entities and individuals designated by the United States government as Foreign Terrorist Organizations (FTOs) or Specially Designated Global Terrorists (SDGTs). Facebook has designated the Taliban as a Dangerous Organization under its policies.

While the content in this case was in Dari, the user submitted their appeal to the Board in English. In their statement, the user explains that they are a journalist and that their post was intended to be an analytical and critical assessment of the situation in Afghanistan. They do not say whether they work for a media organization or as a freelance journalist. They claim that their post was not terrorism and that they were simply providing information about the future of their country. The user states that this was the third time Facebook had restricted their account, despite their being very careful. The user claims that this treatment suggests Facebook’s Persian-language staff “are not impartial.”

The Board would appreciate public comments that address:

  • Whether Meta’s initial decision to remove the post is consistent with Facebook’s Dangerous Individuals and Organizations Community Standard, its stated values and human rights responsibilities and commitments.
  • Content moderation challenges specific to Afghanistan and languages spoken in the country.
  • How content moderation policies impact public discourse in Afghanistan, before and after the Taliban takeover.
  • The safety of journalists and extent of media freedom in Afghanistan since the Taliban takeover, and how these factors affect reporting about the Taliban and the public’s access to information on the political and security challenges facing the country.
  • Whether Facebook’s Dangerous Individuals and Organizations Community Standard unnecessarily limits discussion of designated groups that either form or take the place of governments.
  • The relationship between US law prohibiting material support of designated terrorist organizations and Facebook’s content policies, and how this may impact freedom of expression globally.

In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 30 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

Public comments

If you or your organization feel that you can contribute valuable perspectives that would help the Board reach a decision on the cases announced today, you can submit your contributions using the links above. The public comment window for these cases is open for 14 days, closing at 15:00 UTC on Tuesday, November 16.

Changes to our Bylaws

The Board’s work over the last year has given us new insights into how best to serve users in the way we select and decide cases. As such, today we are announcing several changes to our Bylaws.

Under the revised Bylaws, we have extended the timeframe for selecting cases from 60 days to 90 days from when they are referred to us, allowing for more meaningful analysis of appeals. In addition, the 90-day clock for publishing our decisions will now start when we publish our selection of a case, rather than when the case is assigned to a panel. The cases announced today will be the last to operate under the previous timetable; going forward, the timing set out in this latest version of the Bylaws will apply.

We have also made changes that give us more time to render a final decision when faced with technical or operational issues. Finally, we have amended the Bylaws so that we can publish information about case panel participation and recusals without naming specific Board Members.

These changes to the Bylaws were approved prior to the announcement last week of Facebook’s name change. While we expect to update our website and governing documents to reflect the new name in due course, those changes are not reflected in the version of the Bylaws released today.

What’s next

In the coming weeks, Board Members will be deliberating these cases. Once they have reached their final decisions, we will post them on the Oversight Board website. To receive updates when the Board announces new cases or publishes decisions, sign up here.

Attachments

Oversight Board Bylaws