Documents Show Guidelines Facebook Moderators Follow to Decide on Content, Says Newspaper

The Guardian claims to have had access to a 300-page document that shows examples and rules to be followed by employees and contractors. Users could praise mass murderers or “non-state armed groups” in certain situations.

The British newspaper “The Guardian” published on Tuesday (23) a series of reports describing rules and examples that Facebook content moderators must follow when deciding what content should be removed from the social network.

The newspaper claims to have had access to a 300-page document that guided the employees and contractors responsible for evaluating users’ posts.

According to the reports, Facebook users could praise mass murderers or “non-state armed groups” in certain situations.

G1 contacted Facebook about the case, but the company had not commented by the time this report was last updated.

Crimes recognized by Facebook
The leak indicates that Facebook maintains a list of “recognized crimes” and instructs its moderators to distinguish them when applying the rules. This would be done to avoid helping countries where laws are considered incompatible with human rights.

“We recognize only crimes that cause physical, financial or mental harm to individuals”, such as “theft, robbery and fraud, murder, vandalism [and] non-consensual sexual touching”, say the guidelines.

Crimes not recognized by Facebook include “claims about sexuality”, “peaceful protests against governments” and “discussing historical events or controversial issues, such as religion”.

Attacks on public figures

According to “The Guardian”, the documents indicate that public figures can be subjected to attacks that are prohibited against other people. It would be allowed, for example, to call for the death of these famous people.

These personalities are defined by their number of followers or their presence in the media. In the guidelines, the social network tells its moderators that such people can be considered targets “because we want to allow discussion, which often includes critical comments from people who appear in the news”.

However, the content must be removed if the user tags the famous person in the post when making the attack.

Zuckerberg worried about disclosing guidelines
In 2018, the social network’s CEO, Mark Zuckerberg, explained why Facebook did not disclose the details of its moderation guidelines.

He said there was “a basic incentive problem” and that “when (the content) is not checked, people will disproportionately engage with the most sensational and provocative content”.

“Our research suggests that no matter where we draw the lines for what is allowed, as content approaches that line, people will engage with it more on average – even when they tell us afterwards that they don’t like the content,” said Zuckerberg.

Independent social network committee
In 2020, Facebook created the Supervisory Committee, an independent body that received an investment of $130 million to function as a kind of high court for content moderation.

The initiative is a response to criticism of the way the company moderates posts on Facebook and Instagram, both of which it owns.

According to the committee’s bylaws, the decisions made by the group are final, which means the networks are obliged to comply with its rulings.