Oversight Board Overturns Meta’s Decision in “UK Drill Music” Case
November 22, 2022
The Oversight Board has overturned Meta’s decision to remove a UK drill music video clip from Instagram. Meta originally removed the content following a request from the Metropolitan Police. This case raises concerns about Meta’s relationships with law enforcement, which have the potential to amplify bias. The Board makes recommendations to improve respect for due process and transparency in these relationships.
UPDATE JANUARY 2023
In this case, the Oversight Board submitted a Freedom of Information request to the Metropolitan Police Service (MPS), asking about the nature and volume of requests the MPS made to social media companies, including Meta, to review or remove drill music content over a one-year period. The MPS responded on October 7, 2022, providing figures on the number of requests sent and the number that resulted in removals. The MPS response was published in full alongside the decision, and the figures it contained were included in the Board’s decision, published on November 22, 2022. On January 4, 2023, the MPS contacted the Board to say it had identified errors in its response, and corrected them. Notably, it corrected the figures as follows:
- All of the 992 requests [corrected from 286 requests] that the Metropolitan Police made to social media companies and streaming services to review or remove content between June 2021 and May 2022 involved drill music.
- Those requests resulted in 879 removals [corrected from 255 removals].
- 28 requests related to Meta’s platforms [corrected from 21 requests], resulting in 24 removals [corrected from 14 removals].

The decision contains the original figures prior to the MPS corrections. This update does not change the Oversight Board’s analysis or decision in this case. The updated Freedom of Information response can be found here.
About the Case
In January 2022, an Instagram account that describes itself as publicizing British music posted content highlighting the release of the UK drill music track, "Secrets Not Safe" by Chinx (OS), including a clip of the track’s music video.
Shortly after, the Metropolitan Police, which is responsible for law enforcement in Greater London, emailed Meta requesting that the company review all content containing “Secrets Not Safe.” The Metropolitan Police also provided additional context which, according to Meta, covered information on gang violence, including murders, in London, and the Police’s concern that the track could lead to further retaliatory violence.
Meta’s specialist teams reviewed the content. Relying on the context provided by the Metropolitan Police, they found that it contained a “veiled threat” referencing a 2017 shooting, which could potentially lead to further violence. The company removed the content from the account under review for violating its Violence and Incitement policy. It also removed 52 pieces of content containing the track “Secrets Not Safe” from other accounts, including Chinx (OS)’s. Meta’s automated systems later removed the content a further 112 times.
Meta referred this case to the Board. The Board requested that Meta also refer Chinx (OS)’s post of the content. However, Meta said that this was not possible because removing the “Secrets Not Safe” video from Chinx (OS)’s account ultimately led to the account being deleted, and its content was not preserved.
Key Findings
The Board finds that removing this content does not align with Meta’s Community Standards, its values, or its human rights responsibilities.
Meta lacked sufficient evidence to conclude that the content contained a credible threat, and the Board’s own review did not uncover evidence to support such a finding. In the absence of such evidence, Meta should have given more weight to the content’s artistic nature.
This case raises concerns about Meta’s relationships with governments, particularly where law enforcement requests lead to lawful content being reviewed against the Community Standards and removed. While law enforcement can sometimes provide context and expertise, not every piece of content that law enforcement would prefer to have taken down should be taken down. It is therefore critical that Meta evaluates these requests independently, particularly when they relate to artistic expression from individuals in minority or marginalized groups for whom the risk of cultural bias against their content is acute.
The channels through which law enforcement makes requests to Meta are haphazard and opaque. Law enforcement agencies are not asked to meet minimum criteria to justify their requests, and interactions therefore lack consistency. The data Meta publishes on government requests is also incomplete.
The lack of transparency around Meta’s relationship with law enforcement creates the potential for the company to amplify bias. A freedom of information request made by the Board revealed that all of the 286 requests the Metropolitan Police made to social media companies and streaming services to review or remove musical content from June 2021 to May 2022 involved drill music, which is particularly popular among young Black British people. 255 of these requests resulted in platforms removing content. 21 requests related to Meta platforms, resulting in 14 content removals. The Board finds that, to honor its values and human rights responsibilities, Meta’s response to law enforcement requests must respect due process and be more transparent.
This case also raises concerns around access to remedy. As part of this case, Meta told the Board that when the company takes content decisions “at escalation,” users cannot appeal to the Board. A decision taken “at escalation” is made by Meta’s internal specialist teams. According to Meta, all decisions on law enforcement requests are made “at escalation” (unless the request is made through a publicly available “in-product reporting tool”), as are decisions on certain policies that can only be applied by Meta’s internal teams. This situation adds to concerns raised in preparing the Board’s policy advisory opinion on cross-check, in which Meta revealed that, between May and June 2022, around a third of content in the cross-check system could not be escalated to the Board.
Meta has referred escalated content to the Board on several occasions, including this one. However, the Board is concerned that users have been denied access to remedy when Meta makes some of its most consequential content decisions. The company must address this problem urgently.
The Oversight Board's Decision
The Oversight Board overturns Meta's decision to remove the content.
The Board recommends that Meta:
- Create a standardized system for receiving content removal requests from state actors. This should include asking state actors to meet minimum criteria, such as explaining how a policy has been violated and providing evidence for this.
- Publish data on state actor content review and removal requests for Community Standard violations.
- Regularly review its data on content moderation decisions prompted by state actor requests to assess for any systemic biases, and create a mechanism to address any bias it identifies.
- Provide users with the opportunity to appeal to the Oversight Board when Meta makes content decisions “at escalation.”
- Preserve accounts and content penalized or disabled for posting content that is subject to an open investigation by the Board.
- Update its value of “Voice” to reflect the importance of artistic expression.
- Clarify in the Community Standards that removing content as a “veiled threat” requires one primary and one secondary signal, and make clear which signals are primary and which are secondary.
For Further Information
To read the full decision, click here.
To read a synopsis of public comments for this case, click here.
To read the Metropolitan Police’s response to the Board’s freedom of information request, please click the attachment below.