Case Description
In September 2024, a user posted an AI-manipulated video of a person who appears to be Brazilian soccer player Ronaldo Nazário encouraging others to download the Plinco app. Plinco is a popular online game that involves dropping a ball down a peg-filled board, with players winning different prizes based on where the ball lands. The video begins with Ronaldo speaking to the camera. While it appears realistic at first, the audio imitating the soccer star is not in sync with his lip movements. The video then shows AI-generated images of a schoolteacher, a bus driver and a grocery store worker, as well as the average salary for these jobs in Brazil. The audio imitating Ronaldo’s voice claims that Plinco is simple to play, and that average players can earn more money from the game than from these jobs. Finally, the video encourages users to click a link to download Plinco, although this leads to a different game called Bubble Shooter. The post was viewed about 600,000 times.
A user reported the content to Meta as a fraud or scam, but the company did not remove the content. The user then appealed to the Board, stating that Meta failed to warn people about the post’s false information and its use of a public figure in a scam.
As a result of the Board selecting this case, Meta determined that its decision to leave the content up was in error and removed the post for violating its Fraud, Scams and Deceptive Practices Community Standard. According to Meta, the content violates its prohibition on content that “attempts to establish a fake persona or to pretend to be a famous person in an attempt to scam or defraud.” Meta explained that by making it appear as if Ronaldo is promoting an online game through AI, the video attempts to scam people into using a product they might not otherwise download without his endorsement. Content violating this rule can only be removed by Meta’s specialized “escalation” teams, rather than by at-scale reviewers, because it requires additional context to enforce. Meta also found that the content violates its Spam Community Standard, as the post includes a link that purports to lead to a download of Plinco but instead leads to a different game.
The Board’s selection of this case will allow it to examine, for the first time, the challenges in enforcing Meta’s Fraud, Scams and Deceptive Practices and Spam policies, particularly in the context of online gambling. The Board also aims to assess the impacts of deepfake endorsements on the broader public and the people depicted. This case falls within the Board’s strategic priority of Automated Enforcement of Policies and Curation of Content.
The Board would appreciate public comments that address:
- The socioeconomic impact of deepfake endorsements imitating public figures, both on the public and on the figures being imitated, especially in Brazil.
- The effectiveness of Meta’s enforcement practices for its policies against scams, specifically for content that contains fake personas and the impersonation of public figures, in Brazil and other regions.
- How Meta’s announcement on January 7, 2025, about ending proactive enforcement of certain policies could affect the number of deepfake endorsements on Meta’s platforms, particularly in regions where user reports are less frequent.
As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.
Public Comments
If you or your organization feel you can contribute valuable perspectives that can help with reaching a decision on the case announced today, you can submit your contributions using the button below. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23:59 Pacific Standard Time (PST) on Thursday, 20 February.
What’s Next
Over the next few weeks, Board Members will be deliberating this case. Once they have reached their decision, we will post it on the Decisions page.