In October, the Board accepted a request from Meta, in the form of a policy advisory opinion, to review the company’s cross-check system and make recommendations on how it can be changed. Today, the Board is opening public comments for this policy advisory opinion.
Over the last several weeks, media reporting has drawn renewed attention to the seemingly inconsistent way in which Facebook makes decisions, and to why greater transparency and independent oversight of Facebook matter so much for users. As part of the Board's commitment to transparency, today we are publishing our first quarterly transparency reports.
The Oversight Board has upheld Facebook’s decision to remove a post discussing South African society under its Hate Speech Community Standard. The Board found that the post contained a slur which, in the South African context, was degrading, excluding and harmful to the people it targeted.
The Oversight Board has overturned Facebook’s decision to remove a post showing a video of protesters in Colombia criticizing the country’s president, Ivan Duque. In the video, the protesters use a word designated as a slur under Facebook’s Hate Speech Community Standard. Assessing the public interest value of this content, the Board found that Facebook should have applied the newsworthiness allowance in this case.
Last week, new information emerged on Facebook’s ‘cross-check’ system, which the company uses to review content decisions relating to some high-profile users. This information came to light through the reporting of the Wall Street Journal, and we are grateful for the efforts of journalists who have shed greater light on issues that are relevant to the Board’s mission.
The Oversight Board agrees that Facebook was correct to reverse its original decision to remove content on Facebook that shared a news post about a threat of violence from the Izz al-Din al-Qassam Brigades, the military wing of the Palestinian group Hamas. Facebook originally removed the content under the Dangerous Individuals and Organizations Community Standard, and restored it after the Board selected this case for review.
The Oversight Board has upheld Facebook’s decision to leave up a post by a state-level medical council in Brazil which claimed that lockdowns are ineffective and that they had been condemned by the World Health Organization (WHO).
The Oversight Board has overturned Facebook’s decision to remove a post in Burmese under its Hate Speech Community Standard. The Board found that the post did not target Chinese people, but the Chinese state. Specifically, it used profanity to reference Chinese governmental policy in Hong Kong as part of a political discussion on the Chinese government’s role in Myanmar.
The Oversight Board has overturned Facebook’s original decision to remove an Instagram post encouraging people to discuss the solitary confinement of Abdullah Öcalan, a founding member of the Kurdistan Workers’ Party (PKK). After the user appealed and the Board selected the case for review, Facebook concluded that the content was removed in error and restored it.
The Oversight Board has announced a new case. In May, a Facebook user in Egypt shared a post by a verified Al Jazeera news page about the escalating violence in Israel and the Occupied Palestinian Territories.
Today, the Board announced it has accepted a policy advisory opinion request from Facebook on the sharing of private residential information. As part of this, the Board has issued a call for public comments.
The Oversight Board has overturned Facebook’s decision to remove a comment in which a supporter of imprisoned Russian opposition leader Alexei Navalny called another user a “cowardly bot.” Facebook removed the comment for using the word “cowardly,” which it construed as a negative character claim.
The Oversight Board has overturned Facebook’s decision to remove a comment under its Hate Speech Community Standard. A majority of the Board found it fell into Facebook’s exception for content condemning or raising awareness of hatred.
The Board has upheld Facebook’s decision on January 7, 2021, to restrict then-President Donald Trump’s access to posting content on his Facebook page and Instagram account. However, it was not appropriate for Facebook to impose the indeterminate and standardless penalty of indefinite suspension. Facebook’s normal penalties include removing the violating content, imposing a time-bound period of suspension, or permanently disabling the page and account.
The Oversight Board has overturned Facebook’s decision to remove a post under its Dangerous Individuals and Organizations Community Standard. After the Board identified this case for review, Facebook restored the content. The Board expressed concerns that Facebook did not review the user’s appeal against its original decision. The Board also urged the company to take action to avoid mistakes which silence the voices of religious minorities.
The Oversight Board has upheld Facebook’s decision to remove specific content that violated the express prohibition on posting caricatures of Black people in the form of blackface, a prohibition contained in its Hate Speech Community Standard.
The Oversight Board has overturned Facebook’s decision to remove a post under its Violence and Incitement Community Standard. While the company considered that the post contained a veiled threat, a majority of the Board believed it should be restored. The Board’s decision to restore the post should be implemented only after the user has been notified and has consented.
Following the publication of our first case decisions, the Board is announcing its next cases and opening the public comments process. The Board has selected two cases, including the case it accepted last week relating to former US President Trump’s indefinite suspension from Facebook and Instagram.
Today, the Oversight Board is announcing its first decisions. In the five case decisions published today, the Board overturned four of Facebook’s decisions, upheld one and issued nine policy recommendations to the company. The cases covered four continents: Asia, Europe, North America and South America. None of these cases had easy answers and deliberations revealed the enormous complexity of the issues involved.
The Oversight Board has overturned Facebook’s decision to remove a post which Facebook claimed “contributes to the risk of imminent… physical harm.” The Board found Facebook’s misinformation and imminent harm rule (part of its Violence and Incitement Community Standard) to be inappropriately vague and recommended, among other things, that the company create a new Community Standard on health misinformation.
The Oversight Board has overturned Facebook’s decision to remove a post which the company claims violated its Community Standard on Dangerous Individuals and Organizations. The Board found that these rules were not made sufficiently clear to users.
The Oversight Board has overturned Facebook’s decision to remove a post on Instagram. After the Board selected this case, Facebook restored the content. Facebook’s automated systems originally removed the post for violating the company’s Community Standard on Adult Nudity and Sexual Activity. The Board found that the post was allowed under a policy exception for “breast cancer awareness” and that Facebook’s automated moderation in this case raised important human rights concerns.
The Oversight Board has overturned Facebook’s decision to remove a post under its Hate Speech Community Standard. The Board found that, while the post might be considered offensive, it did not reach the level of hate speech.
Today the Oversight Board accepted a case referral from Facebook to examine its decision to indefinitely suspend former US President Donald Trump’s access to post content on Facebook and Instagram. Facebook has also requested policy recommendations from the Board on suspensions when the user is a political leader.
Today the Oversight Board is releasing the outcome of a human rights report requested by the Board and delivered by the non-profit organization Business for Social Responsibility (BSR). We are also publishing our procedures for how Board Members select and review cases, as well as how they make policy recommendations to Facebook.
Today we're announcing an important milestone in the progress of the Oversight Board. From today, if your content is removed from Facebook or Instagram and you have exhausted the company's appeal process, you can challenge this decision by appealing to the Oversight Board. Similarly, Facebook can now refer cases for a decision about whether content should remain up or come down. In the coming months you will also be able to appeal to the Board about content you want Facebook to remove.
Today, the impact of social media on people’s lives is hard to overstate. Often, that impact is positive. As the world lives through a global health crisis, social media has become a lifeline for helping people and communities to stay connected.