Ever wondered how, and on what principles, the social media giant Facebook deals with graphic content that is reported to its moderators? The Guardian has published a series called the "Facebook Files," which focuses on the guidelines Facebook moderators follow whenever graphic violence or other disturbing content is reported to them.

Guidelines Seem Contradictory

According to the Facebook Files, the guidelines used for moderation seem contradictory in some situations. Because Facebook is such a widely used medium for communication, cases of graphic violence, cruelty to animals, threats of violence, and even non-sexual child abuse are often reported to the company's moderators. The Guardian dug deep into the issue and reviewed more than 100 internal "training manuals, spreadsheets, and flowcharts." Since this material forms the basis for the company's moderation decisions, the report seems credible. One statement found in these documents stands out: Facebook aims to

"provide a platform for free speech while also avoiding real-world harm."

Facebook’s Proactive Approach

It's not that Facebook stays away from such matters until they are reported; rather, the company takes a proactive approach, using algorithms that automatically remove certain kinds of disturbing content, such as child sexual abuse material or terrorism-related posts. But obviously there is much more that needs to be addressed, and this is where human moderators enter the scene.

Examples May Sound Funny To You


While digging into the problem, we found that some of the example statements in the guidelines even sound funny. For instance, "I'm going to kill you John!" is acceptable to remain on Facebook, while "I'm going to kill you John, I have the perfect knife to do it!" is not.
It may sound funny, but it is actually quite hard at times to determine whether a statement is just a joke or a serious threat. Moreover, Facebook automatically removes threats directed at vulnerable groups, such as homeless people or Zionists.

Some Of The Disturbing Content Is Allowed Intentionally

You may have noticed disturbing content, such as clips of animal abuse, in documentaries on Facebook. The social media site keeps this material purely for the purpose of raising awareness among people, so some of this disturbing content is allowed. Another interesting fact is that Facebook allows content showing someone trying to harm themselves, because the company "doesn't want to censor or punish people in distress who are attempting suicide."