Overturned

Comment Targeting People with Down Syndrome

A user appealed Meta’s decision to leave up a Facebook comment targeting individuals with Down syndrome and other disabilities.

Type of Decision

Summary

Policies and Topics

Topic
Discrimination, Marginalized communities
Community Standard
Bullying and Harassment, Hate Speech

Region/Countries

Location
United States

Platform

Facebook

Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

A user appealed Meta’s decision to leave up a Facebook comment targeting individuals with Down syndrome and other disabilities. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and removed the comment.

About the Case

In February, a Facebook user commented on a post containing a Netflix advertisement for the show “Love on the Spectrum.” In the comment, the user said they see people on the spectrum as “a different species of human.” The user also mentioned, by name, a specific individual they know, stating, “this girl … was also the only fat kid in class,” and added that people like this individual are “difficult to interact with ... they are a different kind. A different form of human.”

In their appeal to the Board against Meta’s original decision to leave up this comment, the user characterized it as a “literal tirade against individuals with Down syndrome.” The appealing user noted that “this should not require an entire essay to explain why this comment should have been flagged by the Facebook system” and urged Meta to “do better.”

According to Meta’s Hateful Conduct policy, the company removes dehumanizing speech targeting people on the basis of protected characteristics, such as disability. This includes comparisons with or generalizations about: “Subhumanity (including but not limited to: savages, devils, monsters).” In the comment, the user makes such a generalization, labeling people with Down syndrome as “a different species of human.”

Additionally, Meta’s Bullying and Harassment policy prohibits content that targets a specific person with “statements of inferiority about their physical appearance.” The user’s description of a specific person as “the only fat kid” qualifies as a statement of inferiority about physical appearance. This is a violation under Tier 1 of the policy, which provides “universal protections for everyone.” The targeted person does not need to report the content themselves for it to constitute a violation and be removed.

After the Board brought this case to Meta’s attention, the company determined that the content violated both the Hateful Conduct and the Bullying and Harassment policies and that its original decision to leave up the comment was incorrect: the comment violated the Hateful Conduct policy by labeling people with Down syndrome as “a different species of human,” and the Bullying and Harassment policy by describing a specific individual as “the only fat kid.” The company then removed the content from Facebook.

Board Authority and Scope

The Board has authority to review Meta’s decision following an appeal from the user who reported the content that was then left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.

Significance of Case

This case is a particularly blatant example of dehumanizing speech about people with disabilities. The comment is not subtle or coded: it states that Down syndrome makes a person “a different species of human.” That such content was not removed suggests a serious issue with Meta’s enforcement systems.

The case thus highlights Meta’s failure to effectively enforce its policies against hateful conduct. Recent reports indicate that online harassment of people with disabilities, including those with Down syndrome, continues to increase significantly. According to the report on Countering Cyberbullying Against Persons with Disabilities from the Office of the United Nations High Commissioner for Human Rights (OHCHR), individuals with disabilities are “significantly more likely to experience cyberbullying” and “may even withdraw from digital spaces altogether as a result of online abuse.” The World Down Syndrome Day 2024 campaign, #EndTheStereotypes, called for people around the world to end the stereotypes, stressing that stereotypes can prevent people with Down syndrome and intellectual disabilities from being treated with respect.

The Board has issued recommendations aimed at improving Meta’s policy enforcement and reducing errors, urging the company to continuously improve its ability to detect content that violates its Hate Speech (now Hateful Conduct) Community Standard.

For instance, the Board has recommended that Meta “share [with the public] the results of the internal audits it conducts to assess the accuracy of human review and performance of automated systems in the enforcement of its Hate Speech [now Hateful Conduct] policy” (Criminal Allegations Based on Nationality, recommendation no. 2). In its initial response to the Board, Meta reported that it would implement this recommendation in part, stating that, while the company “will continue to share data on the amount of hate speech content addressed by [its] detection and enforcement mechanisms in the Community Standards Enforcement Report (CSER),” data on the accuracy of its enforcement on a global scale would be shared confidentially with the Board. The recommendation was issued in September 2024; implementation is in progress, with data yet to be shared with the Board.

The Board is concerned that Meta has not publicly shared what, if any, human rights due diligence it performed prior to the policy and enforcement changes announced on January 7, 2025, as highlighted by the Board in the Criticism of EU Migration Policies and Immigrants, Posts Displaying South Africa’s Apartheid-Era Flag, Gender Identity Debate Videos and Posts Supporting UK Riots decisions. A less proactive enforcement approach may result in a higher prevalence of content targeting members of vulnerable groups, such as the comment under review in this decision. In those decisions, the Board emphasized that “[i]n relation to the enforcement changes, due diligence should be mindful of the possibilities of both overenforcement (Call for Women’s Protest in Cuba, Reclaiming Arabic Words) as well as underenforcement (Holocaust Denial, Homophobic Violence in West Africa, Post in Polish Targeting Trans People).” The Board also highlighted the importance of Meta ensuring that “adverse impacts of these changes on human rights are identified, mitigated and prevented, and publicly reported.”

Decision

The Board overturns Meta’s original decision to leave up the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.
