Multiple Case Decision

Statements Targeting Indigenous Australians

2 cases included in this bundle

FB-CRZUPEP1
Case about hate speech on Facebook

Outcome: Overturned
Platform: Facebook
Topics: Discrimination, Marginalized communities, Race and ethnicity
Standard: Hate Speech
Location: Australia
Date: Published on August 1, 2024

FB-XJP78ARB
Case about hate speech on Facebook

Outcome: Overturned
Platform: Facebook
Topics: Discrimination, Marginalized communities, Race and ethnicity
Standard: Hate Speech
Location: Australia
Date: Published on August 1, 2024

Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

A user appealed Meta’s decisions to leave up two Facebook posts, both shared by a single user, which respond to news articles with commentary targeting the Indigenous population of Australia. After the Board brought the appeals to Meta’s attention, the company reversed its original decisions and removed both posts.

About the Cases

Between December 2023 and January 2024, an Australian user shared two Facebook posts about Indigenous Australians. The first post contains a link to an article detailing an Indigenous land council’s effort to buy land in a park in one of Sydney’s suburbs. The post’s caption calls on Indigenous people to “bugger off to the desert where they actually belong.” The second post shares an article about a car chase in northeastern Australia. The caption of the post calls for “Aboriginal ratbags” to serve prison time along with receiving “100 strokes of the cane.”

Meta’s Hate Speech policy prohibits statements that support or advocate for the segregation or exclusion of people on the basis of race and ethnicity. Meta specifically prohibits content that explicitly calls for “expelling certain groups” and content that supports “denying access to spaces (physical and online).” The policy also bans “targeted cursing” and “generalizations that state inferiority,” including “mental characteristics” directed at a person or group of people based on their protected characteristic(s).

After the Board brought these cases to Meta’s attention, the company determined that both posts violated its Hate Speech policy and that its original decisions to leave them up were incorrect. The company then removed both posts from Facebook.

Meta explained to the Board that the first post calls for the exclusion of Indigenous Australians from the parkland, and that the phrase “bugger off” directed at them is an example of targeted cursing against members of a protected group. For the second post, Meta acknowledged that the term “ratbag” is derogatory, with meanings in Australian English that include “stupid person,” and that its use therefore violates the Hate Speech policy’s prohibition on statements referring to members of a protected-characteristic group as mentally inferior.

Board Authority and Scope

The Board has authority to review Meta’s decisions following appeals from the users who reported content that was then left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.

Significance of Cases

The Board has repeatedly emphasized the particular importance of addressing hate speech targeted at groups that have historically been and continue to be discriminated against (South Africa Slurs and Post in Polish Targeting Trans People decisions). The Board has also raised serious concerns that Meta’s enforcement practices may disproportionately impact First Nations peoples. In the Wampum Belt decision, the Board noted that while mistakes are inevitable, “the types of mistakes and the people or communities who bear the burden of those mistakes reflect design choices that must constantly be assessed and examined.” In that case, the Board emphasized the importance of Meta monitoring the accuracy of its hate speech enforcement not only generally but with particular sensitivity to enforcement errors for “subcategories of content where incorrect decisions have a particularly pronounced impact on human rights.” The Board explained that it was therefore “incumbent on Meta to demonstrate that it has undertaken human rights due diligence to ensure its systems are operating fairly and are not exacerbating historical and ongoing oppression.”

On calls for exclusion, the Board has recommended that Meta rewrite its value of “Safety” to reflect that “online speech may pose risk to the physical security of persons and the right to life, in addition to the risks of intimidation, exclusion and silencing” (Alleged Crimes in Raya Kobo, recommendation no. 1). Meta has demonstrated implementation of this recommendation through published information.

Decision

The Board overturns Meta’s original decisions to leave up the content. The Board acknowledges Meta’s correction of its initial errors once the Board brought the cases to Meta’s attention.
