Oversight Board Overturns Facebook's Original Decision in the "Punjabi Concern Over the RSS in India" Case
April 29, 2021
The Oversight Board has overturned Facebook’s decision to remove a post under its Dangerous Individuals and Organizations Community Standard. After the Board identified this case for review, Facebook restored the content. The Board expressed concerns that Facebook did not review the user’s appeal against its original decision. The Board also urged the company to take action to avoid mistakes which silence the voices of religious minorities.
About the case
In November 2020, a user shared a video post from the Punjabi-language online media company Global Punjab TV. The video featured a 17-minute interview with Professor Manjit Singh, who is described as “a social activist and supporter of the Punjabi culture.” The post also included a caption mentioning the Hindu nationalist organization Rashtriya Swayamsevak Sangh (RSS) and India’s ruling party, the Bharatiya Janata Party (BJP): “RSS is the new threat. Ram Naam Satya Hai. The BJP moved towards extremism.”
In text accompanying the post, the user claimed the RSS was threatening to kill Sikhs, a minority religious group in India, and to repeat the “deadly saga” of 1984, when Hindu mobs massacred and burned Sikh men, women and children. The user alleged that Prime Minister Modi himself was formulating the threat of “Genocide of the Sikhs” on the advice of the RSS President, Mohan Bhagwat. The user also claimed that Sikh regiments in the army had warned Prime Minister Modi of their willingness to die to protect Sikh farmers and their land in Punjab.
After one user reported the post, a human reviewer determined that it violated Facebook’s Dangerous Individuals and Organizations Community Standard and removed it. This triggered an automatic restriction on the user’s account. Facebook told the user that it could not review their appeal of the removal because of a temporary reduction in review capacity due to COVID-19.
Key findings
After the Board identified this case for review, but prior to it being assigned to a panel, Facebook realized that the content had been removed in error and restored it. Facebook noted that none of the groups or individuals mentioned in the content are designated as “dangerous” under its rules. The company also could not identify the specific words in the post that led to its removal.
The Board found that Facebook’s original decision to remove the post was not consistent with the company’s Community Standards or its human rights responsibilities.
The Board noted that the post highlighted the concerns of minority and opposition voices in India that are allegedly being discriminated against by the government. It is particularly important that Facebook take steps to avoid mistakes that silence such voices. While recognizing the unique circumstances of COVID-19, the Board argued that Facebook did not give adequate time or attention to reviewing this content. It stressed that users should be able to appeal cases to Facebook before they come to the Board, and urged the company to prioritize restoring this capacity.
Considering the above, the Board found the account restrictions that excluded the user from Facebook particularly disproportionate. It also expressed concern that Facebook’s rules on such restrictions are spread across many locations, and that not all of them appear in the Community Standards, where users would expect to find them.
Finally, the Board noted that Facebook’s transparency reporting makes it difficult to assess whether enforcement of the Dangerous Individuals and Organizations policy has a particular impact on minority language speakers or religious minorities in India.
The Oversight Board’s decision
The Board overturns Facebook’s original decision to remove the content. In a policy advisory statement, the Board recommends that Facebook:
- Translate its Community Standards and Internal Implementation Standards into Punjabi. Facebook should also aim to make its Community Standards accessible in all languages widely spoken by its users.
- Restore both human review of content moderation decisions and access to a human appeals process to pre-pandemic levels as soon as possible, while protecting the health of Facebook’s staff and contractors.
- Increase public information on error rates by making this information viewable by country and language for each Community Standard in its transparency reporting.
For further information, see the full case decision and the synopsis of public comments for this case.