Case Description
The Oversight Board will address the three cases below together, choosing either to uphold or overturn Meta’s decisions on a case-by-case basis.
Three Facebook users shared content during the UK riots that took place between July 30 and August 7, 2024. The riots were sparked by a knife attack that occurred on July 29 in Southport, England, in which three girls were killed and eight other children injured at a Taylor Swift-themed dance workshop. In the aftermath of the attack, misinformation about the attacker's identity, wrongly suggesting he was a Muslim asylum seeker or a Muslim, spread rapidly on social media. The ensuing violence and disorder involved thousands of people, including groups identified as far-right and anti-immigration, and resulted in damage to property, including refugee centers and hotels housing immigrants, and injuries to police officers.
The first post expressed agreement with the riots, calling for more mosques to be smashed and buildings to be set on fire where “scum are living,” also referring to “migrants, terrorist.” The person posting the content acknowledges the riots have damaged private property and injured police officers, but argues that without this violence, the authorities will not listen and put a stop to “all the scum coming into Britain.” Finally, the post reminded readers of the murders, stating the three girls will not be the last victims if the public does not do something. This post was viewed more than 1,000 times and had fewer than 50 comments.
The second piece of content is a reshare of another post. It contains what looks like an AI-generated image of a giant man wearing a Union Jack (the UK’s flag) T-shirt who is chasing several Muslim men. The image has a text overlay providing a time and place to gather for one of the protests, and includes the hashtag “EnoughIsEnough.” This content has had fewer than 1,000 views.
The third post is a repost of another likely AI-generated image, this one of four Muslim men wearing white kurtas (tunics), running in front of the Houses of Parliament after a crying blond-haired toddler in a Union Jack T-shirt. One of the men waves a knife, while above them a plane flies toward Big Ben. The image is accompanied by the caption, “wake up.” This piece of content also had more than 1,000 views and fewer than 50 comments.
All three posts were reported by other Facebook users for violating either the Hate Speech or the Violence and Incitement policy. After assessments by Meta’s automated tools, all three posts were kept up on Facebook. The users who reported the content appealed against the posts remaining up, but Meta’s automated systems confirmed the initial decisions. None of the posts was assessed by a human reviewer. The same users then appealed to the Board, stating that the content is either inciting violence against migrants, promoting the narrative that immigrants are to blame for the Southport murders, or encouraging people to attend the riots.
The Board selected these cases to examine Meta’s policy preparedness and crisis response to violent riots targeting migrant and Muslim communities. These cases fall within the Board’s strategic priority of Hate Speech Against Marginalized Groups.
As a result of the Board selecting these cases, Meta determined that its previous decision to leave the first post on Facebook was an error. The company removed the post under its Violence and Incitement policy. Meta confirmed that its decisions to leave the second and third posts on Facebook were correct. According to the company, the second post (giant man) did not violate its Violence and Incitement policy because the image did not constitute a call for violence against a target. The third post (four men running after a toddler) did not violate the Hate Speech policy because Meta interpreted the image as referring to a specific Muslim man or men – and not all Muslim people.
Given the misinformation circulating widely on social media at the time – that the perpetrator of the Southport attack was a Muslim or an immigrant – Meta determined the image should be interpreted as a reference to a specific crime, rather than as dehumanizing speech targeting Muslim people or Muslim immigrants. The company added that its approach to enforcement in a protest context is to favor maximum protection for voice.
The Board would appreciate public comments that address:
- The role social media played in the 2024 UK riots, including in spreading misinformation, organizing riots or informing the public.
- Social and political discourse on immigration and migrants in the UK.
- The appropriate balance between voice and safety in protest settings in different contexts, including any specific triggers or limits that should be considered.
- Any documented links between anti-immigrant and anti-Muslim speech and violence and discrimination.
- The role that imagery (e.g., pictures, graphics, memes and videos, including AI-generated content) plays in online hate speech.
- Challenges for automated systems in assessing incitement or hate speech in imagery, especially AI-generated imagery.
As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to these cases.
Comments
I feel Meta (and other social media platforms) failed to prevent disinformation being spread to incite violence and hatred. There were actors using the Meta platform to stir up the public; many of these accounts have blue ticks, and one claims he “was only asking questions” when he reposted information.
My trust in Meta to handle disinformation has seriously declined, and I believe that if they do not curtail the publishing of knowingly false information, Meta will be part of the rise of the far right and other hate groups. It seems very clear that Meta completely failed in this case.
CONTEXT of the post has to be considered. Banning a post outright not only deprives the public of valid news for them to make their own decisions, but the lack of such coverage may place local people in a dangerous position if they don't know what's happening nearby.
But straightforward current events can be misrepresented in a way that creates bias and incites hate and violence, and still skirt the "community standards," because the post presents its poisonous message by inference, not directly. (I've flagged posts on several current events that do exactly this, but been told they "do not violate community standards.") This is common with posts about the Middle East, India and Pakistan, the US and Russia, and other conflicts around the world.
So it also goes without saying that whatever the standard is, it has to be one that can be applied equally to all kinds of similar issues on all fronts, not just the UK riots.
Hello. Unfortunately, everyday life for ordinary people in Europe and America has encountered problems that cannot be ignored. I am a Muslim, but I disagree with the behavior of many Muslims. You Muslim brothers and sisters should understand that you are guests in that country, and respect is essential. Please also respect other religions and beliefs.