Oversight Board Announces Cases Involving the Australian Electoral Commission’s Voting Rules
Today, the Board is announcing two new cases for consideration. As part of this, we are inviting people and organizations to submit public comments.
Case Selection
As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse, or raise important questions about Meta’s policies.
The cases that we are announcing today are:
Australian Electoral Commission’s Voting Rules
User appeals to restore content to Facebook
Submit public comments, which can be provided anonymously, here.
These two cases concern content decisions made by Meta, both on Facebook, which the Oversight Board intends to address together.
In October 2023, two Facebook users separately posted screenshots showing partial information shared by the Australian Electoral Commission (AEC) on X (formerly Twitter), ahead of the Indigenous Voice to Parliament Referendum. The referendum, held on October 14, asked whether Australia’s Constitution should be amended to recognize the First Peoples of Australia “by establishing a body called the Aboriginal and Torres Strait Islander Voice.” The information shared by the AEC, which is Australia’s electoral body, appears to be part of a longer thread (series of interconnected posts) on X.
The screenshots from the AEC posted by the Facebook users included the following language: “If someone votes at two different polling places within their electorate, and places their formal vote in the ballot box at each polling place, their vote is counted.” They also show another comment from the same thread, which explains that the secrecy of the ballot prevents the AEC from “knowing which ballot paper belongs to which person,” while also reassuring people that “the number of double votes received is incredibly low.” However, the screenshots do not show all the information shared by the AEC, including that voting multiple times is an offence in Australia.
The Facebook user in the first case shared the screenshot in a Facebook group, of which they are the administrator. The accompanying caption in English said: “Vote early, vote often, and vote NO.” The user in the second case posted the same screenshot on their Facebook profile but with extensive text overlay, which included: “so you can vote multiple times … they are setting us up for a ‘rigging’ … smash the voting centres … it’s a No, No, No, No, No.” The caption in the second case included a “stop” emoji followed by the words “Australian Electoral Commission.”
In both cases, Meta proactively identified the posts, which were automatically sent for human review. Following human review, both posts were removed for violating Meta’s Coordinating Harm and Promoting Crime policy. Both users then appealed Meta’s decisions to remove their posts. However, due to a technical error, Meta issued Oversight Board reference IDs to the users as soon as the appeals were submitted. This resulted in the users bringing the cases to the Board before their appeals were reviewed by Meta. After the Board brought the cases to Meta’s attention, the company confirmed its original decisions to remove the posts were correct.
In their statements to the Board, both users claimed they were posting content from the AEC. The second user additionally asserted that their post served as a “warning to others” that the “election may be fraudulent” for allowing multiple voting.
The Board selected these cases to examine Meta’s content moderation policies and enforcement practices on false or misleading voting information and voter fraud, given the historic number of elections in 2024. These cases fall within the Board’s strategic priority of Elections and Civic Space.
The Board would appreciate public comments that address:
- The sociohistorical context of the 2023 Indigenous Voice to Parliament Referendum in Australia.
- Any relevant context or history of voter fraud in Australia.
- The spread of voter fraud-related content, and false or misleading information about voting, elections and constitutional referenda across social media platforms.
- Content moderation policies and enforcement practices, including fact-checking, on misleading, decontextualized and/or voter fraud-related content.
As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to these cases.
Public Comments
If you or your organization feel you can contribute valuable perspectives that can help with reaching a decision on the cases announced today, you can submit your contributions using the link above. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23:59 your local time on Thursday 22 February.
What’s Next
Over the next few weeks, Board members will be deliberating these cases. Once they have reached their final decisions, we will post them on the Oversight Board website.
To receive updates when the Board announces new cases or publishes decisions, sign up here.